US12025981B2 - System and method for automatically detecting erratic behaviour of another vehicle with a vehicle's autonomous driving system - Google Patents
- Publication number
- US12025981B2 (application US16/796,334)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- image
- processor
- autonomous
- above method
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0011—Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/182—Selecting between different operative modes, e.g. comfort and performance modes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/14—Adaptive cruise control
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60W30/165—Automatically following the path of a preceding lead vehicle, e.g. "electronic tow-bar"
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0287—Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/22—Command input arrangements
- G05D1/221—Remote-control arrangements
- G05D1/227—Handing over between remote control and on-board control; Handing over between remote control arrangements
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/69—Coordinated control of the position or course of two or more vehicles
- G05D1/692—Coordinated control of the position or course of two or more vehicles involving a plurality of disparate vehicles
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/80—Arrangements for reacting to or preventing system or operator failure
- G05D1/81—Handing over between on-board automatic and on-board manual control
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/082—Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/09675—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where a selection from the received information takes place in the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096775—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a central station
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096791—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is another vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K31/00—Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
- B60K31/0008—Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
- B60K2031/0016—Identification of obstacles; Selection of a target vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K31/00—Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
- B60K31/0008—Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
- B60K2031/0025—Detecting position of target vehicle, e.g. vehicle driving ahead from host vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
- B60W2050/0083—Setting, resetting, calibration
- B60W2050/0088—Adaptive recalibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G05D2201/0213—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/22—Platooning, i.e. convoy of communicating vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
Definitions
- the present disclosure is generally directed to vehicle systems and, in particular, to autonomous vehicles.
- Driving a vehicle, in particular a semi-truck or other load-carrying vehicle, requires a great deal of attention from the driver.
- Newer vehicles and trucks attempt to alleviate some of the burden of driving by introducing some level of autonomy into the vehicle.
- Self-driving is one form of autonomy being developed.
- Current self-driving vehicles require a great deal of hardware and a suite of different sensors to function effectively or safely. This additional hardware increases the cost of self-driving/autonomous systems and slows adoption of the technology.
- FIG. 1 depicts a plurality of vehicles, in an environment, in accordance with embodiments of the present disclosure
- FIG. 2 A depicts a block diagram of an embodiment of a vehicle computing environment of a vehicle in accordance with embodiments of the present disclosure
- FIG. 2 B depicts a hardware/software configuration for a server or cloud computing function of a system, which may supply driving models, in accordance with embodiments of the present disclosure
- FIG. 2 C depicts a computing environment that may function as a server, user computer, or other system provided and described herein, in accordance with embodiments of the present disclosure
- FIG. 2 D depicts an example of a computer system upon which a server, computer, computing device, or other system or components may be deployed or executed in accordance with embodiments of the present disclosure
- FIG. 3 A depicts one or more software structures including one or more nodes and/or a data structure in accordance with embodiments of the present disclosure
- FIG. 3 B depicts one or more software structures including one or more nodes and/or a data structure in accordance with embodiments of the present disclosure
- FIG. 3 C depicts one or more software structures including one or more nodes and/or a data structure in accordance with embodiments of the present disclosure
- FIG. 4 depicts details of an autonomous model generation system in accordance with embodiments of the present disclosure
- FIG. 5 depicts additional details of one or more autonomous vehicle models that may reside at a fleet vehicle, database, and/or be generated by an autonomous vehicle model generation system in accordance with embodiments of the present disclosure
- FIG. 6 depicts additional details of an autonomous vehicle model in accordance with embodiments of the present disclosure
- FIG. 9 depicts a third method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- FIG. 10 depicts a fourth method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- FIG. 12 depicts a sixth method for autonomously driving a vehicle in accordance with embodiments of the present disclosure
- FIG. 13 depicts a method for determining an environmental concern for an autonomous vehicle in accordance with embodiments of the present disclosure
- FIG. 14 depicts a seventh method for autonomously driving a vehicle in accordance with embodiments of the present disclosure
- FIG. 16 depicts a method for automatically calibrating one or more image acquisition devices in accordance with embodiments of the present disclosure
- FIG. 17 depicts a method of calibrating one or more camera parameters over time in accordance with embodiments of the present disclosure
- FIG. 18 depicts a method for calibrating a first image acquisition device based on information received from a second image acquisition device in accordance with embodiments of the present disclosure
- FIG. 19 A depicts a method utilized to adjust one or more parameters of an autonomous vehicle based on an autonomous vehicle model in accordance with embodiments of the present disclosure
- FIG. 19 B depicts a method utilized to adjust one or more parameters of an autonomous vehicle based on an autonomous vehicle model in accordance with embodiments of the present disclosure
- FIG. 20 depicts one or more methods for receiving and/or requesting a new autonomous vehicle model in accordance with embodiments of the present disclosure
- FIG. 21 depicts an example method directed to updating one or more autonomous vehicle models in accordance with embodiments of the present disclosure
- FIG. 22 D depicts additional details related to object detection in accordance with embodiments of the present disclosure.
- FIG. 22 E depicts additional details related to object detection in accordance with embodiments of the present disclosure.
- FIG. 24 B depicts additional details related to following a lane in accordance with embodiments of the present disclosure
- FIG. 25 A depicts additional details related to defining a lane in accordance with embodiments of the present disclosure
- FIG. 25 B depicts additional details related to defining a lane in accordance with embodiments of the present disclosure
- FIG. 25 C depicts additional details related to defining a lane in accordance with embodiments of the present disclosure.
- FIG. 25 D depicts additional details related to defining a lane in accordance with embodiments of the present disclosure.
- FIG. 26 A depicts additional details related to calibration using an image in accordance with embodiments of the present disclosure
- FIG. 26 B depicts additional details related to calibration using an image in accordance with embodiments of the present disclosure
- FIG. 26 C depicts additional details related to calibration using an image in accordance with embodiments of the present disclosure
- FIG. 27 A depicts additional details of a data structure for storing and/or communication information in accordance with embodiments of the present disclosure
- FIG. 27 B depicts additional details of a data structure for storing and/or communication information in accordance with embodiments of the present disclosure.
- FIG. 28 depicts an example data structure 2800 in accordance with embodiments of the present disclosure.
- Embodiments of the present disclosure will be described in connection with a vehicle, and in accordance with at least some situations or configurations, a semi-truck or freight vehicle and associated systems.
- FIG. 1 shows a plan view of a vehicle 104 in an environment 100 , in accordance with embodiments of the present disclosure.
- the vehicle 104 comprises a vehicle front 110 , vehicle aft or rear 120 , vehicle roof 130 , at least one vehicle side 160 , a vehicle undercarriage 140 , and a vehicle interior 150 .
- the vehicle 104 may include one or more interior components inside an interior space 150 , exterior components on the exterior parts of the vehicle, systems to control the vehicle movement, vehicle speed, vehicle acceleration, vehicle deceleration, climate control, infotainment, other control systems, structural components, etc.
- vehicle 104 may include any conveyance or model of a conveyance, where the conveyance is designed for the purpose of moving freight, such as animals, cargo, and the like, or for conveying people.
- Vehicles may include, but are not limited to, cars, trucks, motorcycles, buses, automobiles, trains, trams, other railed conveyances, boats, ships, marine conveyances, submarine conveyances, airplanes, spacecraft, flying machines, human-powered conveyances, and the like.
- the vehicle 104 may include a number of sensors, devices, and/or systems that are capable of assisting in driving operations with or without human input.
- the sensors and systems may include, but are in no way limited to, one or more of cameras and/or other optical sensors (e.g., independent, stereo, combined image, multispectral, etc.), infrared (IR) sensors, ultraviolet (UV) sensors, radio frequency (RF) sensors, ultrasonic sensors (e.g., transducers, transceivers, etc.), RADAR sensors (e.g., object-detection sensors and/or systems), LIDAR systems, odometer sensors and/or devices (e.g., encoders, etc.), orientation sensors (e.g., accelerometers, gyroscopes, magnetometer, etc.), navigation sensors and systems (e.g., GPS, etc.), and other ranging, imaging, and/or object-detecting sensors.
- the sensors may be disposed in an interior space 150 of the vehicle 104 and/or on an outside of the vehicle 104 .
- the sensors and systems may also be disposed in one or more portions of a vehicle 104 (e.g., the frame, a body panel, a compartment, etc.).
- the vehicle sensors and systems may be selected and/or configured to suit a level of operation associated with the vehicle 104 .
- the number of sensors used in a system may be altered to increase or decrease information available to a vehicle control system (e.g., affecting control capabilities of the vehicle 104 ).
- the sensors and systems may be part of one or more advanced driver assistance systems (ADAS) associated with a vehicle 104 .
- the sensors and systems may be used to provide driving assistance at any level of operation (e.g., from fully-manual to fully-autonomous operations, etc.) as described herein.
- the vehicle 104 includes only optical sensors to lower costs of implementing the system.
- the vehicle 104 may operate at a predetermined, configurable, or static level of autonomy for vehicle driving operations.
- At Level 0, a no-automation level, a driver (e.g., a human driver) is responsible for all of the driving operations of the vehicle.
- At Level 1, the vehicle may be responsible for a limited number of the driving operations associated with the vehicle, while the driver is still responsible for most driving control operations.
- For example, cruise control is a Level 1 autonomous feature, in which the vehicle may control the throttle and/or braking operations.
- At Level 2, the vehicle may collect information (e.g., via one or more driving assistance systems, sensors, etc.) about an environment of the vehicle (e.g., surrounding area, roadway, traffic, ambient conditions, etc.) and use the collected information to control driving operations (e.g., steering, accelerating, braking, etc.) associated with the vehicle.
- the driver may be required to perform some aspects of driving operations not controlled by the vehicle; however, as in Levels 0 and 1, the driver monitors or controls at least some of the driving operations of the vehicle 104 .
- At Level 3, the vehicle 104 controls the driving operations, and the driver is disengaged from controlling the vehicle 104 except when the vehicle 104 requests an operator to act or intervene in some operation. Thus, the driver conducts operations only when required or needed by the vehicle 104 .
- At Level 4, the vehicle 104 conducts driving operations even if a driver ignores or fails to respond to a request to intervene. Level 4 may be referred to as a “High Automation” level.
- At Level 5, the vehicle 104 can control all of the driving operations associated with the vehicle in all driving modes, regardless of what the driver may be doing.
- the vehicle 104 in Level 5, may continually monitor traffic, vehicular, roadway, and/or other conditions in the environment 100 while the vehicle is in operation.
- the vehicle 104 may also be in communication with one or more outside entities, for example, a computing system 122 , another vehicle 128 , etc.
- Communication may be a wireless interconnection 112 as described herein.
- the wireless interconnection 112 can be directly to the other entity, for example, the vehicle 128 , or to a communication antenna 108 (e.g., a cellular tower, a wireless network endpoint, etc.).
- the communication signal may be sent through a network 116 (e.g., a local area network (LAN), a wide area network (WAN), etc.) to the computing system 122 .
- the computing system 122 can be any type of computer, as described herein, including a server or server system.
- FIG. 1 generally shows a vehicle sensing environment 100 at least partially viewed by the sensors disposed in, on, and/or about the vehicle 104 .
- Each sensor may include an operational detection range R and an operational detection angle θ.
- the operational detection range R may define the effective detection limit, or distance, of the sensor. In some cases, this effective detection limit may be defined as a distance from a portion of the sensor (e.g., a lens, sensing surface, etc.) to a point in space offset from the sensor.
- the effective detection limit may define a distance, beyond which, the sensing capabilities of the sensor deteriorate, fail to work, or are unreliable.
- the effective detection limit may define a distance, within which, the sensing capabilities of the sensor are able to provide accurate and/or reliable detection information.
- the operational detection angle α may define at least one angle of a span, between horizontal and/or vertical limits, of a sensor.
- the operational detection limit and the operational detection angle α of a sensor together may define the effective detection zone (e.g., the effective detection area and/or volume, etc.) of a sensor.
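The range R and angle α described above can be combined into a simple membership test. The following sketch is illustrative only; the function and parameter names are hypothetical, not from the patent:

```python
import math

def in_detection_zone(sensor_pos, sensor_heading_deg, range_r, alpha_deg, target_pos):
    """Return True if target_pos lies inside the sensor's effective
    detection zone, defined by range R and detection angle alpha
    centered on the sensor's heading."""
    dx = target_pos[0] - sensor_pos[0]
    dy = target_pos[1] - sensor_pos[1]
    distance = math.hypot(dx, dy)
    if distance > range_r:
        return False  # beyond the effective detection limit
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed difference between the bearing and the sensor heading
    off_axis = (bearing - sensor_heading_deg + 180) % 360 - 180
    return abs(off_axis) <= alpha_deg / 2

# A forward-facing sensor with a 100 m range and a 60-degree span
print(in_detection_zone((0, 0), 0, 100.0, 60.0, (50, 10)))  # True: ahead, slightly offset
print(in_detection_zone((0, 0), 0, 100.0, 60.0, (0, 50)))   # False: 90 degrees off-axis
```

Overlapping zones of two sensors would simply be two such tests returning True for the same target.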
- the vehicle 104 may include an imaging system to detect visual information in an environment surrounding the vehicle 104 .
- the visual information detected in the environment surrounding the imaging system may be processed (e.g., via one or more sensor and/or system processors, etc.) to generate a complete 360-degree view of an environment 100 around the vehicle 104 .
- the imaging system may be configured to generate changing 360-degree views of the environment 100 in real-time, for instance, as the vehicle 104 drives.
- Sensor data and information may be collected by one or more sensors or systems of the vehicle 104 monitoring the vehicle sensing environment 100 .
- This information may be processed (e.g., via a processor, computer-vision system, etc.) to determine targets (e.g., objects, signs, people, markings, roadways, conditions, etc.) inside one or more detection areas associated with the vehicle sensing environment 100 .
- information from multiple sensors may be processed to form composite sensor detection information.
- a first sensor and a second sensor may correspond to a first camera and a second camera aimed in a forward traveling direction of the vehicle 104 .
- images collected by the two cameras may be combined to form stereo image information.
- This composite information may increase the capabilities of a single sensor in the one or more sensors by, for example, adding the ability to determine depth associated with targets in the one or more detection zones.
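The depth capability gained by combining the two cameras follows the standard stereo relationship depth = focal length × baseline / disparity. A minimal sketch, with assumed camera parameters that are not taken from the disclosure:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Estimate the depth (in meters) of a target from the pixel disparity
    between matched features in the left and right camera images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 25 cm camera baseline.
# A target with 20 px of disparity is about 10 m ahead.
print(stereo_depth(800, 0.25, 20))  # 10.0
```

The same relationship applies to the rear-facing camera pair described below.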
- Similar image data may be collected by rear view cameras (e.g., sensors) aimed in a rearward traveling direction of the vehicle 104 .
- multiple sensors may be effectively joined to increase a sensing zone and provide increased sensing coverage.
- multiple sensors disposed on the front 110 of the vehicle 104 may be joined to provide a zone of coverage that spans across an entirety of the front 110 of the vehicle 104 .
- These overlapping detection zones may provide redundant sensing, enhanced sensing, and/or provide greater detail in sensing within a particular portion of a larger zone.
- the sensors of the vehicle 104 may be arranged to create a complete coverage, via one or more sensing zones around the vehicle 104 . In some areas, the sensing zones of two or more sensors may intersect at an overlap zone. In some areas, the angle and/or detection limit of two or more sensing zones may meet at a virtual intersection point or plane.
- the vehicle 104 may include a number of sensors disposed proximal to the rear 120 of the vehicle 104 .
- These sensors can include, but are in no way limited to, an imaging sensor, a camera, IR sensors, radio object-detection and ranging (RADAR) sensors, RF sensors, ultrasonic sensors, and/or other object-detection sensors.
- these sensors may detect targets near or approaching the rear of the vehicle 104 .
- another vehicle approaching the rear 120 of the vehicle 104 may be detected by one or more of the ranging and imaging system (e.g., LIDAR), rear-view cameras, and/or rear facing RADAR sensors.
- the images from the rear-view cameras may be processed to generate a stereo view (e.g., providing depth associated with an object or environment, etc.) for targets visible to both cameras.
- These sensor arrangements may provide critical sensor information to a vehicle control system in at least one of the autonomous driving levels described above. For instance, when the vehicle 104 is driving autonomously (e.g., Level 3, Level 4, or Level 5) and detects other vehicles stopped in a travel path, the sensor detection information may be sent to the vehicle control system of the vehicle 104 to control a driving operation (e.g., braking, decelerating, etc.) associated with the vehicle 104 (in this example, slowing the vehicle 104 so as to avoid colliding with the stopped vehicles). As yet another example, the vehicle 104 may be operating and one or more of the imaging system and/or the side-facing sensors may detect targets at a side 160 of the vehicle 104.
- the sensors may detect a target that is both at a side 160 and a front 110 of the vehicle 104 (e.g., disposed at a diagonal angle to a centerline of the vehicle 104 running from the front 110 of the vehicle 104 to the rear 120 of the vehicle). Additionally or alternatively, the sensors may detect a target that is both, or simultaneously, at a side 160 and a rear 120 of the vehicle 104 (e.g., disposed at a diagonal angle to the centerline of the vehicle 104 ).
- FIG. 2 A is a block diagram of an embodiment of a vehicle computing environment 200 of the vehicle 104 in accordance with embodiments of the present disclosure.
- the computing environment 200 may include one or more vehicle sensors and systems 208, a computing system (processor) 204, an interface 228, vehicle control systems (e.g., steering system 236, braking system 240, acceleration system 244, environmental control 248, infotainment system 252, etc.), a communication system to the vehicle control systems (e.g., a controller area network (CAN) bus 232), a navigation system 212, and one or more data stores (e.g., user information 224, model(s) 220, image data 216, etc.).
- associated components may be electrically and/or communicatively coupled to one another via at least one bus or other interconnection.
- the one or more associated components may send and/or receive signals across a communication network to a separate entity, for example, server 122 .
- the computing system 204 can include any hardware and/or software to conduct operations, as described herein, in accordance with embodiments of the present disclosure.
- the computing system 204 may be as described in conjunction with FIGS. 2 C and 2 D .
- Interconnected to the computing system 204 may be one or more data stores 216 - 224 , the sensors 208 , the navigation system, and/or the interface 228 , which are described hereinafter.
- the interface 228 may comprise any type of known communication medium or collection of communication media and may use any type of protocols, such as SIP, TCP/IP, SNA, IPX, AppleTalk, and the like, to transport messages between the computing system 204 and the interface 228 and then translate those messages for transmission onto the CAN bus 232 .
- the interface 228 may include wired and/or wireless communication technologies.
- the interface 228 may include networks such as a Local Area Network (LAN), a Wide Area Network (WAN), a virtual private network (VPN), an infra-red network, a wireless network (e.g., a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth® protocol known in the art, and/or any other wireless protocol), and any other type of packet-switched or circuit-switched network known in the art and/or any combination of these and/or other networks.
- the interface 228 need not be limited to any one network type, and instead may be comprised of a number of different networks and/or network types.
- the interface 228 may comprise a number of different communication media such as coaxial cable, copper cable/wire, fiber-optic cable, antennas for transmitting/receiving wireless messages, and combinations thereof.
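As an illustrative sketch of how a message might be translated for transmission onto the CAN bus 232, the following packs a hypothetical steering command into an 8-byte CAN data field. The arbitration ID and payload layout are invented for this example; in practice they would come from the vehicle's CAN message definitions:

```python
import struct

# Hypothetical arbitration ID for a steering command frame
STEERING_CMD_ID = 0x0C4

def encode_steering_cmd(angle_deg, rate_limit_dps):
    """Pack a steering command into an 8-byte CAN data field:
    a signed 16-bit angle in 0.1-degree units, an unsigned 16-bit
    rate limit in 0.1 deg/s units, and four reserved pad bytes."""
    data = struct.pack(">hH4x", round(angle_deg * 10), round(rate_limit_dps * 10))
    return STEERING_CMD_ID, data

can_id, payload = encode_steering_cmd(-12.5, 90.0)
print(hex(can_id), payload.hex())  # 0xc4 ff83038400000000
```

A CAN transceiver driver would then place this ID and payload into a standard data frame for the bus.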
- the vehicle 104 can include a navigation system 212 (e.g., global positioning system (GPS), etc.) and/or one or more sensors 208 , for example, sensors for orientation, odometer, camera, infrared (IR), and/or other optional sensors, for example LIDAR, RADAR, ultrasonic, and/or other sensor or system.
- the camera sensors may include one or more components configured to detect image information associated with an environment of the vehicle 104 .
- the camera sensors may include a lens, filter, image sensor, and/or a digital image processer. It is an aspect of the present disclosure that multiple camera sensors may be used together to generate stereo images providing depth measurements.
- Examples of the camera sensors as described herein may include, but are not limited to, at least one of ON Semiconductor® MT9V024 Global Shutter VGA GS CMOS image sensors, Teledyne DALSA Falcon2 camera sensors, CMOSIS CMV50000 high-speed CMOS image sensors, other industry-equivalent camera sensors and/or systems, and may perform visual target and/or obstacle detection in an environment around the vehicle 104 using any known or future-developed standard and/or architecture.
- the IR sensors may be configured to detect and/or measure a temperature associated with a target (e.g., an object, pedestrian, other vehicle, etc.).
- Examples of IR sensors as described herein may include, but are not limited to, at least one of Opto Diode lead-salt IR array sensors, Opto Diode OD-850 Near-IR LED sensors, Opto Diode SA/SHA727 steady state IR emitters and IR detectors, FLIR® LS microbolometer sensors, FLIR® TacFLIR 380-HD InSb MWIR FPA and HD MWIR thermal sensors, FLIR® VOx 640 ⁇ 480 pixel detector sensors, Delphi IR sensors, other industry-equivalent IR sensors and/or systems, and may perform IR visual target and/or obstacle detection in an environment around the vehicle 104 using any known or future-developed standard and/or architecture.
- An orientation sensor may include one or more sensors configured to determine an orientation of the vehicle 104 relative to at least one reference point.
- the orientation sensor may include at least one pressure transducer, stress/strain gauge, accelerometer, gyroscope, and/or geomagnetic sensor.
- Examples of the orientation sensor as described herein may include, but are not limited to, at least one of Bosch Sensortec BMX160 series low-power absolute orientation sensors, Bosch Sensortec BMX055 9-axis sensors, Bosch Sensortec BMI055 6-axis inertial sensors, Bosch Sensortec BMI160 6-axis inertial sensors, Bosch Sensortec BMF055 9-axis inertial sensors (accelerometer, gyroscope, and magnetometer) with integrated Cortex M0+ microcontroller, Bosch Sensortec BMP280 absolute barometric pressure sensors, Infineon TLV493D-A1B6 3D magnetic sensors, Infineon TLI493D-W1B6 3D magnetic sensors, Infineon TL family of 3D magnetic sensors, Murata Electronics SCC2000 series combined gyro sensor and accelerometer, Murata Electronics SCC1300 series combined gyro sensor and accelerometer, other industry-equivalent orientation sensors and/or systems, and may perform orientation detection and/or determination functions using any known or future-developed standard and/or architecture.
- Examples of the odometry sensor/system as described herein may include, but are not limited to, at least one of Infineon TLE4924/26/27/28C high-performance speed sensors, Infineon TL4941plusC(B) single chip differential Hall wheel-speed sensors, Infineon TL5041plusC Giant Magnetoresistance (GMR) effect sensors, Infineon TL family of magnetic sensors, EPC Model 25SP Accu-CoderPro® incremental shaft encoders, EPC Model 30M compact incremental encoders with advanced magnetic sensing and signal processing technology, EPC Model 925 absolute shaft encoders, EPC Model 958 absolute shaft encoders, EPC Model MA36S/MA63S/SA36S absolute shaft encoders, Dynapar® F18 commutating optical encoders, Dynapar® HS35R family of phased array encoder sensors, other industry-equivalent odometry sensors and/or systems, and may perform change-in-position detection and/or determination functions using any known or future-developed standard and/or architecture.
- the vehicle 104 can include other optional sensors, which can supplement the visual sensors described above.
- a LIDAR sensor/system may include one or more components configured to measure distances to targets using laser illumination.
- the LIDAR sensor/system may provide 3D imaging data of an environment around the vehicle 104 . The imaging data may be processed to generate a full 360-degree view of the environment around the vehicle 104 .
- the LIDAR sensor/system may include a laser light generator configured to generate a plurality of target illumination laser beams (e.g., laser light channels).
- this plurality of laser beams may be aimed at, or directed to, a rotating reflective surface (e.g., a mirror) and guided outwardly from the LIDAR sensor/system into a measurement environment.
- the rotating reflective surface may be configured to continually rotate 360 degrees about an axis, such that the plurality of laser beams is directed in a full 360-degree range around the vehicle 104 .
- a photodiode receiver of the LIDAR sensor/system may detect when light from the plurality of laser beams emitted into the measurement environment returns (e.g., reflected echo) to the LIDAR sensor/system.
- the LIDAR sensor/system may calculate, based on a time associated with the emission of light to the detected return of light, a distance from the vehicle 104 to the illuminated target. In some embodiments, the LIDAR sensor/system may generate over 2.0 million points per second and have an effective operational range of at least 100 meters.
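The time-of-flight calculation described above reduces to distance = (speed of light × round-trip time) / 2, since the laser light travels out to the target and back. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_distance(round_trip_time_s):
    """Distance from the vehicle to the illuminated target, computed
    from the laser pulse's round-trip time (out and back, so halved)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# A reflected echo arriving 667 ns after emission puts the target at
# roughly 100 m, near the operational range described above.
print(round(lidar_distance(667e-9), 1))  # 100.0
```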
- Examples of the LIDAR sensor/system as described herein may include, but are not limited to, at least one of Velodyne® LiDAR® HDL-64E 64-channel LIDAR sensors, Velodyne® LiDAR® HDL-32E 32-channel LIDAR sensors, Velodyne® LiDAR® PUCK® VLP-16 16-channel LIDAR sensors, Leica Geosystems Pegasus:Two mobile sensor platform, Garmin® LIDAR-Lite v3 measurement sensor, Quanergy M8 LiDAR sensors, Quanergy S3 solid-state LiDAR sensor, LeddarTech® LeddarVU compact solid-state fixed-beam LIDAR sensors, other industry-equivalent LIDAR sensors and/or systems, and may perform illuminated target and/or obstacle detection in an environment around the vehicle 104 using any known or future-developed standard and/or architecture.
- the RADAR sensors may include one or more radio components that are configured to detect objects/targets in an environment of the vehicle 104.
- the RADAR sensors may determine a distance, position, and/or movement vector (e.g., angle, speed, etc.) associated with a target over time.
- the RADAR sensors may include a transmitter configured to generate and emit electromagnetic waves (e.g., radio, microwaves, etc.) and a receiver configured to detect returned electromagnetic waves.
- the RADAR sensors may include at least one processor configured to interpret the returned electromagnetic waves and determine locational properties of targets.
- Examples of the RADAR sensors as described herein may include, but are not limited to, at least one of Infineon RASIC® RTN7735PL transmitter and RRN7745PL/46PL receiver sensors, Autoliv ASP Vehicle RADAR sensors, Delphi L2C0051TR 77 GHz ESR Electronically Scanning Radar sensors, Fujitsu Ten Ltd. Automotive Compact 77 GHz 3D Electronic Scan Millimeter Wave Radar sensors, other industry-equivalent RADAR sensors and/or systems, and may perform radio target and/or obstacle detection in an environment around the vehicle 104 using any known or future-developed standard and/or architecture.
- the ultrasonic sensors may include one or more components that are configured to detect objects/targets in an environment of the vehicle 104 .
- the ultrasonic sensors may determine a distance, position, and/or movement vector (e.g., angle, speed, etc.) associated with a target over time.
- the ultrasonic sensors may include an ultrasonic transmitter and receiver, or transceiver, configured to generate and emit ultrasound waves and interpret returned echoes of those waves.
- the ultrasonic sensors may include at least one processor configured to interpret the returned ultrasonic waves and determine locational properties of targets.
- Examples of the ultrasonic sensors as described herein may include, but are not limited to, at least one of Texas Instruments TIDA-00151 automotive ultrasonic sensor interface IC sensors, MaxBotix® MB8450 ultrasonic proximity sensors, MaxBotix® ParkSonar®-EZ ultrasonic proximity sensors, Murata Electronics MA40H1S-R open-structure ultrasonic sensors, Murata Electronics MA40S4R/S open-structure ultrasonic sensors, Murata Electronics MA58MF14-7N waterproof ultrasonic sensors, other industry-equivalent ultrasonic sensors and/or systems, and may perform ultrasonic target and/or obstacle detection in an environment around the vehicle 104 using any known or future-developed standard and/or architecture.
- the driving vehicle sensors may include other sensors and/or combinations of the sensors described above. Additionally or alternatively, one or more of the sensors described above may include one or more processors configured to process and/or interpret signals detected by the one or more sensors. In some embodiments, the processing of at least some sensor information provided by the vehicle sensors and systems 208 may be processed by at least one sensor processor. Raw and/or processed sensor data may be stored in a sensor data memory storage medium. In some embodiments, the sensor data memory may store instructions used by the sensor processor 204 for processing sensor information provided by the sensors and systems 208 .
- the control data memory may store models used by the computing system 204 for controlling driving operations of the vehicle 104, historical control information, autonomous driving control rules, and the like in a data store 220.
- the control data memory may be a disk drive, optical storage device, solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
- the communications interface 256 in addition to well-known componentry (which has been omitted for clarity), can include interconnected elements including one or more of, but not limited to: one or more antennas, an interleaver/deinterleaver, an analog front end (AFE), memory/storage/cache, controller/microprocessor, MAC circuitry, modulator/demodulator, encoder/decoder, a plurality of connectivity managers, GPU, accelerator, a multiplexer/demultiplexer, transmitter, receiver and wireless radio components such as a Wi-Fi PHY/Bluetooth® module, a Wi-Fi/BT MAC module, transmitter and receiver.
- the various elements in the communications interface 256 are connected by one or more links/busses.
- FIG. 2 D illustrates one embodiment of a computer system 268 upon which the servers 122 , 262 , user computers 204 , computing devices, or other systems or components described above may be deployed or executed.
- the computer system 268 is shown comprising hardware elements that may be electrically coupled via a bus 270 .
- the hardware elements may include one or more central processing units (CPUs) 272 ; one or more input devices 274 (e.g., a mouse, a keyboard, etc.); and one or more output devices 276 (e.g., a display device, a printer, etc.).
- the computer system 268 may also include one or more storage devices 278 .
- the computer-readable storage media/reader 280 can further be connected to a computer-readable storage medium, together (and, optionally, in combination with storage device(s) 278 ) comprehensively representing remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing computer-readable information.
- the communications system 282 may permit data to be exchanged with a network and/or any other computer described above with respect to the computer environments described herein.
- the term “storage medium” may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
- the information about the current autonomous model residing at the specified fleet vehicle 404 A-B may be an autonomous model itself, an identifier of the autonomous model and/or data indicating one or more parameters of the autonomous model and/or vehicle.
- the model manager 428 may perform image processing, such as pre-processing on the one or more received images from the fleet vehicle 404 A.
- an autonomous vehicle model may include specific filters, or feature detectors, 516 , and utilize a varying number of strides to obtain the feature map 522 .
- the feature detectors 516 , the number of features detectors, and other model parameters may be specific to a fleet vehicle, user, time of day, location etc.
- the resulting feature map 522 may be subsampled, subjected to non-linear processing such as ReLU, and pooled resulting in a rectified feature map 532 for example.
- the process of subsampling and pooling may be performed multiple times.
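The filter/stride/ReLU/pooling pipeline described above can be sketched in miniature. This toy example is pure Python and purely illustrative (a production autonomous vehicle model would use a deep-learning framework); it applies a single 3×3 feature detector, rectifies the resulting feature map, and pools it:

```python
def conv2d(image, kernel, stride=1):
    """Slide a feature detector (kernel) over the image to produce a
    feature map; the stride controls how far the detector moves."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(0, len(image) - kh + 1, stride):
        row = []
        for j in range(0, len(image[0]) - kw + 1, stride):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

def relu(fmap):
    """Non-linear processing: clamp negative responses to zero."""
    return [[max(0, v) for v in row] for row in fmap]

def max_pool(fmap, size=2):
    """Pool (subsample) the rectified feature map, keeping the strongest
    response in each size x size window."""
    return [[max(fmap[i + di][j + dj] for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

# A vertical-edge feature detector applied to a tiny 5x5 image
image = [[0, 0, 9, 9, 9]] * 5
kernel = [[-1, 0, 1]] * 3            # 3x3 vertical-edge detector
fmap = conv2d(image, kernel)          # 3x3 feature map
rectified = max_pool(relu(fmap), 2)   # rectified, pooled feature map
print(fmap[0], rectified)             # [27, 27, 0] [[27]]
```

Repeating the convolve/rectify/pool stages, as the passage notes, yields progressively smaller, more abstract feature maps.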
- the ability to drive autonomously is simplified as only a single object need be tracked for driving in the environment 100 .
- the second vehicle may also be blocking the view of the lanes, and thus, the vehicle may need to follow the second vehicle to continue autonomous driving.
- FIG. 8 shows a method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- a general order for the steps of the method 800 is shown in FIG. 8 .
- the method 800 starts with a start operation 804 and ends with an end operation 828 .
- the method 800 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 8 .
- the method 800 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Further, the method 800 can be performed by gates or circuits associated with a processor, an ASIC, a FPGA, a SOC, or other hardware device.
- the method 800 shall be explained with reference to the systems, components, devices, modules, software, signals, data structures, interfaces, methods, etc. described in conjunction with FIGS. 1 - 7 and 9 - 28 .
- the image processing node, executed by the processor 204 and implemented as an ML model in the convolutional neural network, may process the images of the driving environment 100 to locate the identified vehicle 2408.
- different edges, the position of those edges, and other information can be determined.
- Conditions can include an edge or other information, such as movement of an edge over time, location of other sensed data, etc.
- the vehicle 2408 may be identified and followed as described previously in conjunction with FIGS. 22 A- 22 E .
- FIG. 12 shows a method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- a general order for the steps of the method 1200 is shown in FIG. 12 .
- the method 1200 starts with a start operation 1204 and ends with an end operation 1232 .
- the method 1200 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 12 .
- the method 1200 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Further, the method 1200 can be performed by gates or circuits associated with a processor, an ASIC, a FPGA, a SOC, or other hardware device.
- the method 1200 shall be explained with reference to the systems, components, devices, modules, software, signals, data structures, interfaces, methods, etc. described in conjunction with FIGS. 1 - 11 and 13 - 28 .
- the lane identification node can analyze the procession of the edge. If the edge makes a change compared to some set benchmark, e.g., more than 10°, that edge section may be marked. If the edge returns to a more uniform line, that end of the section may also be marked. Thus, the portion deviating from a normalized or best fit line can be marked.
- if a portion of the edge 2506 to points 2512 needs to be filtered, the process 1200 can proceed YES to step 1224. If there is no portion of the edge 2506 to points 2512 that needs to be filtered, the process 1200 can proceed NO to step 1220.
- the lane identification node processes the unfiltered edges and follows those edges to maintain the vehicle 104 within the lane by issuing commands through the CAN bus 232 to the control systems 236 - 254 to follow the lane.
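The benchmark test described above, marking edge sections whose heading deviates from the normalized or best-fit line by more than, e.g., 10°, can be sketched as follows. The function name and edge representation (a list of (x, y) points) are assumptions for illustration:

```python
import math

def mark_deviating_sections(edge_points, benchmark_deg=10.0):
    """Return the indices of edge segments whose heading deviates from
    the edge's overall (endpoint-to-endpoint) direction by more than
    the benchmark angle, so those sections can be filtered out before
    lane following."""
    (x0, y0), (xn, yn) = edge_points[0], edge_points[-1]
    overall = math.degrees(math.atan2(yn - y0, xn - x0))
    marked = []
    for k in range(len(edge_points) - 1):
        (xa, ya), (xb, yb) = edge_points[k], edge_points[k + 1]
        heading = math.degrees(math.atan2(yb - ya, xb - xa))
        # Smallest signed angular difference from the overall direction
        delta = abs((heading - overall + 180) % 360 - 180)
        if delta > benchmark_deg:
            marked.append(k)
    return marked

# A mostly straight lane edge with one kinked section (segments 2 and 3)
edge = [(0, 0), (1, 0.1), (2, 0.1), (3, 1.5), (4, 0.4)]
print(mark_deviating_sections(edge))  # [2, 3]
```

The unmarked segments would then be followed via commands over the CAN bus, as the passage describes.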
- the processor 204 can determine one or more conditions of the environment 100 while the vehicle 104 is not moving, in step 1308 .
- the processor 204 can receive one or more images from the image sensors 208 .
- the nodes 304 may deploy an ML model, executed by the processor 204 and associated with a CNN, to process the image. Within the image, different edges, the position of those edges, and other information can be determined. Conditions can include an edge or other information, such as movement of an edge over time, location of other sensed data, etc.
- FIG. 14 shows a method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- a general order for the steps of the method 1400 is shown in FIG. 14 .
- the method 1400 starts with a start operation 1404 and ends with an end operation 1432 .
- the method 1400 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 14 .
- the method 1400 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Further, the method 1400 can be performed by gates or circuits associated with a processor, an ASIC, a FPGA, a SOC, or other hardware device.
- the method 1400 shall be explained with reference to the systems, components, devices, modules, software, signals, data structures, interfaces, methods, etc. described in conjunction with FIGS. 1 - 13 and 15 - 28 .
- the processor 204 can detect a change of position of the vehicle 124 , in step 1408 .
- a change in position of the vehicle 124 may be determined by a shift in the position of the vehicle 124 object from one image to a subsequent image. Then, if the vehicle 124 moves, e.g., changes lanes, the vehicle 104 may determine whether the vehicle 124 remains in view. The vehicle 104 can continue along a route regardless of whether the vehicle 124 stays on the same route. Thus, in some situations, the vehicle 124 may deviate from the route and disappear from the image generated by the sensors 208.
- the processor 204 can determine that the vehicle 124 has deviated from the route and no longer is present in the images of the sensor 208 . These changes in position of the vehicle 124 over time may be stored, with metadata explaining what, when, where, and why the information was stored, as image data 216 , in step 1436 .
- the processor 204 can determine if the vehicle 124 is driving erratically. For example, if a number of adjustments per hour exceeds a predetermined benchmark, the processor 204 can determine the driving of the vehicle 124 is erratic. If a number of lane changes or other positional changes is more than a predetermined threshold, the processor 204 can determine the vehicle 124 is behaving erratically. If the processor 204 determines that the vehicle 124 is behaving erratically, the process 1400 may proceed YES to step 1424. However, if the processor 204 determines that the vehicle 124 is not behaving erratically, the process 1400 may proceed NO to step 1420, wherein the processor 204 continues to follow the vehicle 124.
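A minimal sketch of the threshold test described above, assuming lane changes are logged as timestamps; the window length and benchmark value are illustrative, not taken from the patent:

```python
def is_erratic(lane_change_times, window_s=3600.0, threshold_per_hour=6):
    """Flag the followed vehicle as erratic when the number of lane
    changes (or other positional adjustments) observed within the
    sliding window exceeds a predetermined benchmark."""
    if not lane_change_times:
        return False
    latest = max(lane_change_times)
    recent = [t for t in lane_change_times if latest - t <= window_s]
    return len(recent) > threshold_per_hour

# Seven lane changes within ten minutes exceeds the hourly benchmark
print(is_erratic([0, 60, 130, 200, 280, 350, 600]))  # True
print(is_erratic([0, 1800, 3500]))                   # False
```

The same test applied to the vehicle 104's own adjustments supports the self-diagnosis described later in FIG. 15.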
- a parameter change such as but not limited to lane changes, acceleration, deceleration, velocity, etc. may be utilized to determine the condition of the followed vehicle.
- a condition of the followed vehicle may be determined to be erratic, not-safe, safe, not-erratic, etc.
- the processor 204 may coordinate a message to be sent, where the message may be sent to the followed vehicle, a different reporting agency, or otherwise, using the network 260 for example.
- the processor 204 can report the behavior and the vehicle 124 to the authorities or to another entity. Thus, the processor 204 can send an automated message to a third party through the communications interface 256 . Further, the processor 204 may also start following a second vehicle 128 as described above.
- FIG. 15 shows a method for autonomously driving a vehicle in accordance with embodiments of the present disclosure.
- a general order for the steps of the method 1500 is shown in FIG. 15 .
- the method 1500 starts with a start operation 1504 and ends with an end operation 1532 .
- the method 1500 can include more or fewer steps or can arrange the order of the steps differently than those shown in FIG. 15 .
- the method 1500 can be executed as a set of computer-executable instructions executed by a computer system and encoded or stored on a computer readable medium. Further, the method 1500 can be performed by gates or circuits associated with a processor, an ASIC, a FPGA, a SOC, or other hardware device.
- the method 1500 shall be explained with reference to the systems, components, devices, modules, software, signals, data structures, interfaces, methods, etc. described in conjunction with FIGS. 1 - 14 and 16 - 28 .
- the processor 204 can enter the autonomous state to follow an identified vehicle 124 .
- an association with that vehicle 124, but not necessarily a particular distance or alignment, may be maintained through controls sent through the CAN bus 232 to the control systems 236-254.
- the processor 204 follows the vehicle 124 along the route by monitoring that vehicle 124 and maintaining an association with that vehicle 124 .
- the association is a relationship with the vehicle 124 , in that, the vehicle 104 follows vehicle 124 but allows some changes in where the vehicle 124 is compared to vehicle 104 .
- the processor 204 can detect a change of position of the vehicle 124 , in step 1508 .
- a change in position of the vehicle 124 may be determined by a shift in the position of the vehicle 124 object from one image to a subsequent image. Then, if the vehicle 124 moves, e.g., changes lanes, the vehicle 104 may determine whether the vehicle 124 remains in view. The vehicle 104 can continue along a route regardless of whether the vehicle 124 stays on the same route. Thus, in some situations, the vehicle 124 may deviate from the route and disappear from the image generated by the sensors 208.
- the processor 204 can determine that the vehicle 124 has deviated from the route and no longer is present in the images of the sensor 208 . These changes in position of the vehicle 124 over time may be stored, with metadata explaining what, when, where, and why the information was stored, as image data 216 , in step 1536 .
- the processor 204 can determine if the vehicle 104 is driving erratically. For example, if a number of adjustments per hour exceeds a predetermined benchmark, the processor 204 can determine the driving of the vehicle 104 is erratic. If a number of lane changes or other positional changes is more than a predetermined threshold, the processor 204 can determine the vehicle 104 is behaving erratically. If the processor 204 determines that the vehicle 104 is behaving erratically, the process 1500 may proceed YES to step 1524. However, if the processor 204 determines that the vehicle 104 is not behaving erratically, the process 1500 may proceed NO to step 1520, wherein the processor 204 continues to follow the vehicle 124.
- the processor 204 can determine a problem with the control of the vehicle and, in step 1528, execute a response to the problem. For example, the processor can determine a necessity for, and execute, the reporting of the behavior and driver of the vehicle 104 to the authorities or to another entity. Thus, the processor 204 can send an automated message to a third party through the communications interface 256. Further, the processor 204 may also start following a second vehicle 128 as described above. In still other situations, the processor 204 can indicate a problem with something in the autonomous drive system based on the erratic behavior. In these situations, the processor 204 may instruct the driving systems to pull over the vehicle 104 and wait for repairs or service.
- one or more calibration parameters may be obtained over time; such calibration parameters allow for the calibration of the image acquisition device in an efficient and ongoing manner.
- such calibration can be performed dynamically on the fly and/or over an ongoing basis.
- a calibration may be as shown in FIGS. 26 A- 26 C .
- an image 2600 is provided having one or more objects 2636 a , 2636 b , 2648 a - 2648 c .
- These objects 2636, 2648 can represent buildings and lane lines or lane markers, respectively, in the image 2600 that can help determine the vanishing point 2632 in the image 2600.
- the building 2636 can have one or more edges 2640 a , 2640 b , 2640 c , 2640 d that help determine the vanishing point.
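One common way to estimate a vanishing point from such edges and lane lines is a least-squares intersection of the extracted lines; the sketch below assumes each line is given by two image points and is an illustrative method, not necessarily the patent's exact procedure.

```python
# Estimate a vanishing point as the least-squares intersection of
# image lines (building edges, lane markers), each given by two points.

def line_coeffs(p1, p2):
    """Line through p1 and p2 as (a, b, c) with a*x + b*y = c."""
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def vanishing_point(lines):
    """Solve the 2x2 normal equations minimizing squared line residuals."""
    s_aa = s_ab = s_bb = s_ac = s_bc = 0.0
    for p1, p2 in lines:
        a, b, c = line_coeffs(p1, p2)
        s_aa += a * a
        s_ab += a * b
        s_bb += b * b
        s_ac += a * c
        s_bc += b * c
    det = s_aa * s_bb - s_ab * s_ab
    x = (s_ac * s_bb - s_bc * s_ab) / det
    y = (s_bc * s_aa - s_ac * s_ab) / det
    return x, y
```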
- the autonomous vehicle model identifies a stop sign and/or traffic light.
- size information from the stop sign and/or traffic light, such as the aperture openings, the size of the LEDs, etc., may be extracted.
- object parameter information may be used as ongoing calibration data for the camera.
- the known feature information may be retrieved from a database at step 1720 and may be utilized to scale or otherwise update one or more calibration parameters associated with the object identified and/or extracted from the image provided by the image acquisition device.
- the new calibration parameters associated with the image acquisition device and vehicle may then be stored in the database, such as database 444 .
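As one hedged example of such ongoing calibration, the pinhole camera model lets a detected object of known physical size (e.g., a standard stop sign) at a known distance re-estimate the focal length; the sign width, distance, and smoothing factor below are illustrative assumptions, not values from the patent.

```python
# Re-estimate focal length from a known-size object, then blend the
# new estimate into the stored calibration parameter.

def focal_length_px(pixel_width, distance_m, real_width_m):
    """Pinhole relation: f = pixel_width * distance / real_width."""
    return pixel_width * distance_m / real_width_m

def updated_focal(old_f, new_f, alpha=0.1):
    """Exponential smoothing keeps calibration stable between updates."""
    return (1.0 - alpha) * old_f + alpha * new_f
```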
- FIG. 18 depicts a method 1800 for calibrating a first image acquisition device based on information received from a second image acquisition device in accordance with embodiments of the present disclosure.
- the system may initiate method 1800 at step 1804, where the method may proceed to step 1808 where an image from a first image acquisition device is obtained.
- the method then proceeds to step 1812 where one or more objects are detected, via feature extraction and classification, as explained previously in conjunction with FIGS. 22 A- 22 E .
- information from a second image acquisition device may be obtained; that is, the second image acquisition device may obtain a second image.
- the second image acquisition device may be associated with the same vehicle or a different vehicle.
- an image may be obtained from a second vehicle where the second image includes scene information that is the same as or similar to a scene obtained from the first image acquisition device.
- the information obtained from the second image acquisition device may be obtained for a same location as the image obtained from the first image acquisition device.
- the second vehicle can send a data structure 2714 as described in conjunction with FIG. 27 B .
- the vehicle 104 may receive data structure 2714 from a data store 2700, as shown in FIG. 27 B, and/or from a second image acquisition device associated with a second vehicle.
- the data structure 2714 can include a vehicle identifier (ID) 2716 , a GPS coordinate 2720 , object information 2724 , and/or a second image 2728 .
- Each association of vehicles can have a data structure 2714 , as represented by ellipses 2748 .
- the vehicle ID can be automatically generated and represent any type of identifier including one or more of, but not limited to, a numeric ID, a license plate number, an alphanumeric ID, a globally unique ID (GUID), etc.
- GPS information 2720 can include a location of the second vehicle at a time when the image was acquired. In this way, the first vehicle can obtain a same image from a same location.
- Object information 2724 may be as defined in conjunction with FIG. 28 A .
- Images 2728 can be one or more images taken by the second image acquisition device for the first image acquisition device.
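The data structure 2714 described above can be sketched as a simple record with the fields named in the description; the concrete types chosen here are assumptions for illustration.

```python
# Sketch of data structure 2714: vehicle ID, GPS coordinate when the
# image was acquired, object information, and the second image.

from dataclasses import dataclass, field
from typing import Any, List, Tuple

@dataclass
class VehicleDataStructure2714:
    vehicle_id: str                      # numeric ID, license plate, GUID, etc.
    gps_coordinate: Tuple[float, float]  # location when the image was acquired
    object_info: List[Any] = field(default_factory=list)
    second_image: Any = None             # image(s) from the second device
```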
- the center of the boxes may be determined for each box as previously described in conjunction with FIGS. 22 A- 22 E .
- the location may be enumerated by a number of pixels (e.g., as a vector length and angle or as a number of pixels in a horizontal direction 2664 and vertical direction 2668) from a vanishing point 2632.
- the calibration node at the first vehicle can use the calibration information for the second image acquisition device and the location of the second vehicle when the image was acquired to make a comparison to a similar image made by the first image acquisition device of the first vehicle.
- the comparison can reveal where the vanishing point 2632 should be and how to determine the pitch and yaw of the camera as explained in conjunction with FIG. 26 C .
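Under the pinhole model, the pitch and yaw mentioned above can be recovered from the vanishing point's pixel offset relative to the principal point; this is an illustrative derivation, and the parameter names (focal lengths `fx`, `fy`) are assumptions.

```python
# Recover camera pitch and yaw (radians) from the vanishing point's
# offset from the principal point under the pinhole model.

import math

def pitch_yaw_from_vp(vp, principal_point, fx, fy):
    """Yaw from the horizontal offset, pitch from the vertical offset."""
    dx = vp[0] - principal_point[0]
    dy = vp[1] - principal_point[1]
    yaw = math.atan2(dx, fx)
    pitch = math.atan2(dy, fy)
    return pitch, yaw
```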
- calibration information such as one or more previously described parameters, may be retrieved and/or obtained for the second image and/or for the second image acquisition device.
- one or more calibration parameters associated with the first image acquisition device may be modified.
- calibration parameters from a second image acquisition device for an image of the first scene may be obtained and may be utilized to update and/or modify the calibration parameters of the first image acquisition device.
- the method 1800 may end at step 1828 .
- the calibration processes above may occur periodically to ensure the image acquisition device remains calibrated.
- FIG. 19 A depicts a method utilized to adjust one or more parameters of an autonomous vehicle based on an autonomous vehicle model in accordance with embodiments of the present disclosure.
- the method 1900 may begin at step 1904 where the method proceeds to receive an image from an image acquisition device at step 1908 .
- the image may be preprocessed, as previously described, and one or more features may be extracted and classified at step 1916 .
- features may be classified from the first image using the autonomous vehicle model and any adjustment may be recommended using the same autonomous vehicle model.
- the autonomous vehicle model may provide the recommendation to the steering angle adjust 468 and/or the velocity adjust 472 where a vehicle adjustment may be made.
- the method 1900 may then end at step 1932 .
- the statistical accuracy and/or method may vary over time but may be maintained in such a manner that the statistical accuracy does not drop below a certain threshold. For example, if the statistical accuracy of the model, based on vehicle adjust output, drops below 94% at step 1944, a request for a new model, or an updated node, may be made at step 1948. If such a model is available at step 1952, the new model may be retrieved and/or provided by the autonomous vehicle model generation system 416 automatically.
- Statistical accuracy may include a measure of time for which user intervention was not needed versus a measure of time for which user intervention was needed.
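The accuracy measure and the 94% threshold check described above can be sketched as follows; the time-based definition follows the description, while the function names are illustrative.

```python
# Statistical accuracy as the fraction of driving time not requiring
# user intervention; falling below the threshold triggers a model request.

def statistical_accuracy(autonomous_seconds, intervention_seconds):
    """Fraction of total time for which no user intervention was needed."""
    total = autonomous_seconds + intervention_seconds
    return autonomous_seconds / total if total else 1.0

def needs_new_model(autonomous_seconds, intervention_seconds, threshold=0.94):
    """True when accuracy has dropped below the threshold (e.g., 94%)."""
    return statistical_accuracy(autonomous_seconds, intervention_seconds) < threshold
```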
- FIG. 20 depicts one or more methods for receiving and/or requesting a new model in accordance with embodiments of the present disclosure.
- the method 2000 may begin at step 2004 and may proceed to step 2008 , where in a first instance, model input parameters may be evaluated to determine if the input parameters comply with model requirements.
- an input for a location may be provided to a model, for example, autonomous vehicle model 452 .
- a specific autonomous vehicle model 452 may be valid for a specified location. If the input location is outside the valid location, for example at steps 2012 and 2016 , a new model may be requested and/or received.
- such input parameters may apply to various times of day, various real-time weather conditions, various times of year, and/or various changes in a user/vehicle.
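The location-validity check described above can be sketched as a containment test; representing the model's valid region as a latitude/longitude bounding box is an assumption for illustration.

```python
# Check whether the current location falls inside the region for which
# the autonomous vehicle model is valid; otherwise request a new model.

def location_valid(model_region, location):
    """model_region: (min_lat, min_lon, max_lat, max_lon); location: (lat, lon)."""
    min_lat, min_lon, max_lat, max_lon = model_region
    lat, lon = location
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

def select_action(model_region, location):
    """Keep the current model inside its region; request a new one outside it."""
    return "use_current_model" if location_valid(model_region, location) else "request_new_model"
```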
- the autonomous vehicle model 452 may be requested and/or otherwise received at the vehicle. In some instances, the autonomous vehicle model 452 may be pushed to the autonomous vehicle. The new autonomous vehicle model 452 may be associated with the approaching region such that the autonomous vehicle includes a geographically relevant autonomous vehicle model.
- an autonomous vehicle model 452 or a portion of the autonomous vehicle model 452 may be updated.
- a specific node (for example 304 C) associated with identifying one or more lane markers may be updated at a first time
- a specific node (for example, 304 I/ 304 J) associated with controlling a steering angle of a vehicle may be updated at a second time.
- both of the previously mentioned nodes may be updated at a same time.
- the update may include a parameter, such as a single threshold; in other instances, an entire node may be replaced with a node included in the update.
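The two update granularities above (patching a single threshold parameter versus replacing an entire node) can be sketched as follows; the dict-of-dicts model representation and update format are assumptions, not the patent's data layout.

```python
# Apply a model update at either granularity: patch one parameter in a
# node, or swap the whole node for a replacement shipped in the update.

def apply_update(model, update):
    """update: {"node": name, "param": key, "value": v}
       or      {"node": name, "replace": new_node_dict}."""
    if "replace" in update:
        model[update["node"]] = update["replace"]                 # replace entire node
    else:
        model[update["node"]][update["param"]] = update["value"]  # patch one parameter
    return model
```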
- the processor 204 may also continually monitor the rate of speed, rate of deceleration, and/or rate of acceleration. Thus, based on sensor data from an accelerometer, the acceleration system 244, the deceleration system 240, the image sensors, etc., the processor 204 can compare a rate of speed, acceleration, and/or deceleration to a predetermined threshold. If the rate meets and/or crosses the threshold, the processor can take a predetermined action. For example, if the deceleration crosses a threshold, the processor 204 may automatically engage the hazard lights to warn other vehicles approaching from behind of the increased deceleration. In other embodiments, the action may occur from measuring a change in rate of another vehicle being sensed by the image sensors.
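The threshold-crossing check above can be sketched as a simple comparison; the threshold value and action name are illustrative assumptions.

```python
# Trigger a predetermined action when measured deceleration meets or
# crosses the threshold (e.g., engage hazard lights to warn traffic behind).

def check_deceleration(decel_mps2, threshold_mps2=4.0):
    """Return the predetermined action, or None below the threshold."""
    return "engage_hazard_lights" if decel_mps2 >= threshold_mps2 else None
```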
- the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements.
- These wired or wireless links can also be secure links and may be capable of communicating encrypted information.
- Transmission media used as links can be any suitable carrier for electrical signals, including coaxial cables, copper wire, and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
- processors (e.g., a single or multiple microprocessors)
- memory
- nonvolatile storage
- input devices (e.g., keyboards and pointing devices)
- output devices (e.g., a display and the like)
- alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
- the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms.
- the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
- the present disclosure in various embodiments, configurations, and aspects, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various embodiments, subcombinations, and subsets thereof. Those of skill in the art will understand how to make and use the systems and methods disclosed herein after understanding the present disclosure.
- the present disclosure in various embodiments, configurations, and aspects, includes providing devices and processes in the absence of items not depicted and/or described herein or in various embodiments, configurations, or aspects hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.
- a method includes: receiving sensor data associated with a driving environment of a vehicle; determining an object within the driving environment; determining if the object is a second vehicle; if the object is a second vehicle, changing an autonomous driving mode to follow the second vehicle automatically; and if the object is not a second vehicle, changing the autonomous driving mode to follow a lane viewed in the driving environment.
- the sensor data is an image of the driving environment.
- determining the object comprises automatically drawing a box over the object in the image.
- At least one aspect of the above method includes where determining the object comprises executing a machine learning model that identifies the object in the box as a vehicle. At least one aspect of the above method includes where the box delineates the extents of the object within the image. At least one aspect of the above method includes where a center of the box is determined. At least one aspect of the above method includes where to follow the second vehicle, a position of the center of the box is maintained in subsequent images. At least one aspect of the above method includes where the second vehicle is in a caravan with the vehicle. At least one aspect of the above method includes where determining if the object is a second vehicle comprises identifying the second vehicle as part of the caravan. At least one aspect of the above method includes where the second vehicle is identified by indicia on the vehicle.
- a system including a memory and a processor in communication with the memory, wherein the processor executes instructions stored in the memory, which cause the processor to execute the above method, at least one aspect of the above method, or combinations thereof.
- a non-transitory computer readable medium is provided, where the non-transitory computer readable medium includes instructions stored thereon, which when executed by a processor cause the processor to execute the above method, at least one aspect of the above method, or combinations thereof.
- a method includes: entering, by a first vehicle, an autonomous driving mode that follows a second vehicle; determining a change in a second position of the second vehicle while the first vehicle is following the second vehicle; determining if the change in the second position requires adjusting a first position of the first vehicle; if the change in the second position requires adjusting the first position of the vehicle, changing automatically the first position of the first vehicle to match the change in the second position; and if the change in the second position does not require adjusting the first position of the vehicle, ignoring the change in the second position.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Data Mining & Analysis (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Atmospheric Sciences (AREA)
- Medical Informatics (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Traffic Control Systems (AREA)
- Business, Economics & Management (AREA)
- Game Theory and Decision Science (AREA)
Abstract
Description
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/796,334 US12025981B2 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically detecting erratic behaviour of another vehicle with a vehicle's autonomous driving system |
| US18/392,733 US20240142977A1 (en) | 2018-07-13 | 2023-12-21 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
Applications Claiming Priority (17)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862697952P | 2018-07-13 | 2018-07-13 | |
| US201862697946P | 2018-07-13 | 2018-07-13 | |
| US201862697912P | 2018-07-13 | 2018-07-13 | |
| US201862697930P | 2018-07-13 | 2018-07-13 | |
| US201862697940P | 2018-07-13 | 2018-07-13 | |
| US201862697960P | 2018-07-13 | 2018-07-13 | |
| US201862697938P | 2018-07-13 | 2018-07-13 | |
| US201862697962P | 2018-07-13 | 2018-07-13 | |
| US201862697915P | 2018-07-13 | 2018-07-13 | |
| US201862697965P | 2018-07-13 | 2018-07-13 | |
| US201862697922P | 2018-07-13 | 2018-07-13 | |
| US201862697957P | 2018-07-13 | 2018-07-13 | |
| US201862697919P | 2018-07-13 | 2018-07-13 | |
| US201862697969P | 2018-07-13 | 2018-07-13 | |
| US201862697971P | 2018-07-13 | 2018-07-13 | |
| US16/511,968 US11994861B2 (en) | 2018-07-13 | 2019-07-15 | System and method for determining a vehicle's autonomous driving mode from a plurality of autonomous modes |
| US16/796,334 US12025981B2 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically detecting erratic behaviour of another vehicle with a vehicle's autonomous driving system |
Related Parent Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/511,968 Continuation US11994861B2 (en) | 2018-07-13 | 2019-07-15 | System and method for determining a vehicle's autonomous driving mode from a plurality of autonomous modes |
| US16/796,730 Continuation US20200192376A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically determining to follow a vehicle in a vehicle's autonomous driving mode |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/796,306 Continuation US20200192375A1 (en) | 2018-07-13 | 2020-02-20 | System and method for updating a ros node in a convolutional neural network |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20210018917A1 US20210018917A1 (en) | 2021-01-21 |
| US12025981B2 true US12025981B2 (en) | 2024-07-02 |
Family
ID=69142890
Family Applications (19)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/511,968 Active 2040-11-13 US11994861B2 (en) | 2018-07-13 | 2019-07-15 | System and method for determining a vehicle's autonomous driving mode from a plurality of autonomous modes |
| US16/796,334 Active 2039-09-21 US12025981B2 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically detecting erratic behaviour of another vehicle with a vehicle's autonomous driving system |
| US16/796,349 Abandoned US20210018918A1 (en) | 2018-07-13 | 2020-02-20 | System and method for improving an autonomous vehicle driving model based on driver engagement conditions |
| US16/796,785 Active 2042-08-22 US12093040B2 (en) | 2018-07-13 | 2020-02-20 | System and method for calibrating an autonomous vehicle camera |
| US16/796,243 Abandoned US20200192372A1 (en) | 2018-07-13 | 2020-02-20 | System and method for updating an autonomous vehicle driving model based on a change in time and/or location |
| US16/796,306 Abandoned US20200192375A1 (en) | 2018-07-13 | 2020-02-20 | System and method for updating a ros node in a convolutional neural network |
| US16/796,319 Pending US20200183395A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically identifying an issue from sensor data from an autonomous drive system while the vehicle is static |
| US16/796,794 Active 2040-05-22 US12298764B2 (en) | 2018-07-13 | 2020-02-20 | System and method for calibrating camera data using a second image sensor from a second vehicle |
| US16/796,289 Active 2039-12-19 US11573569B2 (en) | 2018-07-13 | 2020-02-20 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US16/796,754 Abandoned US20200201329A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically following a vehicle in a vehicle's autonomous driving mode based on received instructions |
| US16/796,765 Abandoned US20200192378A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically switching a vehicle to follow in a vehicle's autonomous driving mode |
| US16/796,775 Abandoned US20200192379A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically following a lane while in a vehicle's autonomous driving mode |
| US16/796,264 Active 2040-01-14 US11422556B2 (en) | 2018-07-13 | 2020-02-20 | System and method for detecting a condition prompting an update to an autonomous vehicle driving model |
| US16/796,730 Abandoned US20200192376A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically determining to follow a vehicle in a vehicle's autonomous driving mode |
| US16/796,744 Active 2041-04-03 US12093039B2 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically determining to follow a divergent vehicle in a vehicle's autonomous driving mode |
| US17/821,346 Pending US20230084316A1 (en) | 2018-07-13 | 2022-08-22 | System and method for detecting a condition prompting an update to an autonomous vehicle driving model |
| US18/147,632 Abandoned US20230259131A1 (en) | 2018-07-13 | 2022-12-28 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US18/513,240 Pending US20240085909A1 (en) | 2018-07-13 | 2023-11-17 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US18/392,733 Pending US20240142977A1 (en) | 2018-07-13 | 2023-12-21 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/511,968 Active 2040-11-13 US11994861B2 (en) | 2018-07-13 | 2019-07-15 | System and method for determining a vehicle's autonomous driving mode from a plurality of autonomous modes |
Family Applications After (17)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/796,349 Abandoned US20210018918A1 (en) | 2018-07-13 | 2020-02-20 | System and method for improving an autonomous vehicle driving model based on driver engagement conditions |
| US16/796,785 Active 2042-08-22 US12093040B2 (en) | 2018-07-13 | 2020-02-20 | System and method for calibrating an autonomous vehicle camera |
| US16/796,243 Abandoned US20200192372A1 (en) | 2018-07-13 | 2020-02-20 | System and method for updating an autonomous vehicle driving model based on a change in time and/or location |
| US16/796,306 Abandoned US20200192375A1 (en) | 2018-07-13 | 2020-02-20 | System and method for updating a ros node in a convolutional neural network |
| US16/796,319 Pending US20200183395A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically identifying an issue from sensor data from an autonomous drive system while the vehicle is static |
| US16/796,794 Active 2040-05-22 US12298764B2 (en) | 2018-07-13 | 2020-02-20 | System and method for calibrating camera data using a second image sensor from a second vehicle |
| US16/796,289 Active 2039-12-19 US11573569B2 (en) | 2018-07-13 | 2020-02-20 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US16/796,754 Abandoned US20200201329A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically following a vehicle in a vehicle's autonomous driving mode based on received instructions |
| US16/796,765 Abandoned US20200192378A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically switching a vehicle to follow in a vehicle's autonomous driving mode |
| US16/796,775 Abandoned US20200192379A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically following a lane while in a vehicle's autonomous driving mode |
| US16/796,264 Active 2040-01-14 US11422556B2 (en) | 2018-07-13 | 2020-02-20 | System and method for detecting a condition prompting an update to an autonomous vehicle driving model |
| US16/796,730 Abandoned US20200192376A1 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically determining to follow a vehicle in a vehicle's autonomous driving mode |
| US16/796,744 Active 2041-04-03 US12093039B2 (en) | 2018-07-13 | 2020-02-20 | System and method for automatically determining to follow a divergent vehicle in a vehicle's autonomous driving mode |
| US17/821,346 Pending US20230084316A1 (en) | 2018-07-13 | 2022-08-22 | System and method for detecting a condition prompting an update to an autonomous vehicle driving model |
| US18/147,632 Abandoned US20230259131A1 (en) | 2018-07-13 | 2022-12-28 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US18/513,240 Pending US20240085909A1 (en) | 2018-07-13 | 2023-11-17 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US18/392,733 Pending US20240142977A1 (en) | 2018-07-13 | 2023-12-21 | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
Country Status (2)
| Country | Link |
|---|---|
| US (19) | US11994861B2 (en) |
| WO (1) | WO2020014683A1 (en) |
Families Citing this family (66)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11151883B2 (en) | 2017-11-03 | 2021-10-19 | International Business Machines Corporation | Empathic autonomous vehicle |
| US20190185012A1 (en) | 2017-12-18 | 2019-06-20 | PlusAI Corp | Method and system for personalized motion planning in autonomous driving vehicles |
| US11130497B2 (en) * | 2017-12-18 | 2021-09-28 | Plusai Limited | Method and system for ensemble vehicle control prediction in autonomous driving vehicles |
| US11273836B2 (en) | 2017-12-18 | 2022-03-15 | Plusai, Inc. | Method and system for human-like driving lane planning in autonomous driving vehicles |
| WO2020014683A1 (en) | 2018-07-13 | 2020-01-16 | Kache.AI | Systems and methods for autonomous object detection and vehicle following |
| FR3084631B1 (en) * | 2018-07-31 | 2021-01-08 | Valeo Schalter & Sensoren Gmbh | DRIVING ASSISTANCE FOR THE LONGITUDINAL AND / OR SIDE CHECKS OF A MOTOR VEHICLE |
| US11861458B2 (en) * | 2018-08-21 | 2024-01-02 | Lyft, Inc. | Systems and methods for detecting and recording anomalous vehicle events |
| CN109345589A (en) * | 2018-09-11 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Location detection method, device, device and medium based on autonomous vehicle |
| US11040714B2 (en) * | 2018-09-28 | 2021-06-22 | Intel Corporation | Vehicle controller and method for controlling a vehicle |
| WO2020080045A1 (en) * | 2018-10-15 | 2020-04-23 | パナソニックIpマネジメント株式会社 | Object recognition method, vehicle control method, information display method, and object recognition device |
| US11079593B2 (en) * | 2018-11-26 | 2021-08-03 | International Business Machines Corporation | Heads up display system |
| KR102135443B1 (en) * | 2019-01-28 | 2020-07-17 | 네이버랩스 주식회사 | Method for position estimation of vehicle based on graph structure and vehicld using the same |
| JP7261022B2 (en) * | 2019-01-30 | 2023-04-19 | キヤノン株式会社 | Information processing system, terminal device and its control method, program, storage medium |
| US10728461B1 (en) * | 2019-01-31 | 2020-07-28 | StradVision, Inc. | Method for correcting misalignment of camera by selectively using information generated by itself and information generated by other entities and device using the same |
| US12326728B2 (en) * | 2019-03-22 | 2025-06-10 | Qualcomm Incorporated | System and method to control the velocity and heading of a vehicle based on preview information |
| EP3951429A4 (en) * | 2019-03-28 | 2022-06-01 | Sony Semiconductor Solutions Corporation | SIGNAL PROCESSING DEVICE, SIGNAL PROCESSING METHOD, PROGRAM AND INFORMATION PROCESSING DEVICE |
| US11227167B2 (en) | 2019-06-28 | 2022-01-18 | Baidu Usa Llc | Determining vanishing points based on lane lines |
| US11120566B2 (en) * | 2019-06-28 | 2021-09-14 | Baidu Usa Llc | Determining vanishing points based on feature maps |
| US11022972B2 (en) * | 2019-07-31 | 2021-06-01 | Bell Textron Inc. | Navigation system with camera assist |
| JP7301736B2 (en) * | 2019-12-27 | 2023-07-03 | 株式会社クボタ | work vehicle |
| JP7328905B2 (en) * | 2020-01-17 | 2023-08-17 | 株式会社日立製作所 | SENSING SYSTEM AND SENSING CONTROL METHOD |
| US20210258751A1 (en) * | 2020-02-18 | 2021-08-19 | Lenovo (Singapore) Pte. Ltd. | Responding to a signal indicating that an autonomous driving feature has been overridden by alerting plural vehicles |
| CN111368784B (en) * | 2020-03-16 | 2024-04-02 | 广州文远知行科技有限公司 | Target identification method, device, computer equipment and storage medium |
| KR20210121657A (en) * | 2020-03-31 | 2021-10-08 | 현대자동차주식회사 | System and method for estimating load input from road surface |
| JP7076492B2 (en) * | 2020-03-31 | 2022-05-27 | 本田技研工業株式会社 | Vehicle control device and vehicle control method |
| US11080544B1 (en) * | 2020-04-24 | 2021-08-03 | StradVision, Inc. | Method and device for calibrating pitch of camera on vehicle and method and device for continual learning of vanishing point estimation model to be used for calibrating the pitch |
| JP7380409B2 (en) * | 2020-04-29 | 2023-11-15 | 株式会社デンソー | Vehicle recording device, information recording method |
| CN111399546A (en) * | 2020-05-08 | 2020-07-10 | 武汉远翔新能源科技有限责任公司 | Device for realizing wired following between people and vehicle and control method |
| DE102020206355A1 (en) * | 2020-05-20 | 2021-11-25 | BSH Hausgeräte GmbH | Creating a map of the area |
| US11623639B2 (en) * | 2020-07-15 | 2023-04-11 | Charter Communications Operating, Llc | Video analytics traffic monitoring and control |
| DE102020210700A1 (en) * | 2020-08-24 | 2022-02-24 | Robert Bosch Gesellschaft mit beschränkter Haftung | More flexible iterative operation of artificial neural networks |
| US11960025B2 (en) * | 2020-10-22 | 2024-04-16 | Micron Technology, Inc. | Intelligent radar having deep learning accelerator and random access memory |
| US20220137641A1 (en) * | 2020-11-04 | 2022-05-05 | Here Global B.V. | Method, apparatus, and computer program product for generating an autonomous driving profile map index |
| CN112435482B (en) * | 2020-11-06 | 2022-06-14 | 福瑞泰克智能系统有限公司 | Parking control method and system in automatic driving mode, electronic equipment and vehicle |
| CN112036389B (en) * | 2020-11-09 | 2021-02-02 | 天津天瞳威势电子科技有限公司 | Vehicle three-dimensional information detection method, apparatus, device and readable storage medium |
| DE102020215780B4 (en) * | 2020-12-14 | 2024-11-07 | Continental Autonomous Mobility Germany GmbH | Method for selecting an automated driving process using a driver assistance system |
| CN112579464A (en) * | 2020-12-25 | 2021-03-30 | 北京百度网讯科技有限公司 | Verification method, device and equipment of automatic driving algorithm and storage medium |
| US11733369B2 (en) * | 2021-02-11 | 2023-08-22 | Waymo Llc | Methods and systems for three dimensional object detection and localization |
| US12384410B2 (en) | 2021-03-05 | 2025-08-12 | The Research Foundation For The State University Of New York | Task-motion planning for safe and efficient urban driving |
| US20220405573A1 (en) * | 2021-06-18 | 2022-12-22 | Ford Global Technologies, Llc | Calibration for a distributed system |
| DE102021209507B4 (en) | 2021-07-07 | 2025-11-13 | Volkswagen Aktiengesellschaft | Method for adjusting a vehicle's detection unit during a vehicle's movement, as well as camera system and vehicle |
| KR20230014344A (en) * | 2021-07-21 | 2023-01-30 | 현대모비스 주식회사 | Apparatus amd method for monitoring surrounding environment of vehicle |
| US20240273762A1 (en) * | 2021-08-06 | 2024-08-15 | Obshchestvo S Ogranichennoi Otvetstvennostiu "Evokargo" | Method for calibrating external parameters of video cameras |
| US20230045550A1 (en) * | 2021-08-09 | 2023-02-09 | University Of South Florida | Automated vehicle identification based on car-following data with machine learning |
| US20240255950A1 (en) * | 2021-08-19 | 2024-08-01 | Merlin Labs, Inc. | Advanced flight processing system and/or method |
| US12315197B2 (en) | 2021-08-23 | 2025-05-27 | Volkswagen Group of America Investments, LLC | Systems and methods for validating camera calibration in real-time |
| CN113442856B (en) * | 2021-08-31 | 2021-12-10 | 国汽智控(北京)科技有限公司 | Control method and device based on adaptive platform and ROS2 and storage medium |
| US11724720B2 (en) * | 2021-09-28 | 2023-08-15 | GM Global Technology Operations LLC | Unsupervised velocity prediction and correction for urban driving entities from sequence of noisy position estimates |
| AU2021468333A1 (en) | 2021-10-15 | 2024-04-18 | Volvo Truck Corporation | A control unit and method therein for operating an autonomous vehicle |
| US20230169630A1 (en) * | 2021-12-01 | 2023-06-01 | Ford Global Technologies, Llc | Image compensation service |
| CN114326443B (en) * | 2022-01-14 | 2023-09-29 | 重庆长安汽车股份有限公司 | MIL simulation test method and system for ADAS and readable storage medium |
| US20250083701A1 (en) * | 2022-01-26 | 2025-03-13 | Mitsubishi Electric Corporation | Action planning apparatus and action planning method |
| CN114212079B (en) * | 2022-02-18 | 2022-05-20 | 国汽智控(北京)科技有限公司 | ACC-based vehicle control method, device and system |
| DE102022201916A1 (en) | 2022-02-24 | 2023-08-24 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for operating a sensor system |
| US20230349702A1 (en) * | 2022-04-27 | 2023-11-02 | Ford Global Technologies, Llc | Systems and methods to follow a vehicle using alternative forms of navigation assistance |
| US12024172B2 (en) | 2022-05-10 | 2024-07-02 | Lenovo (Singapore) Pte. Ltd. | Lane assist for deviating from preexisting traffic lane |
| US20230376801A1 (en) * | 2022-05-17 | 2023-11-23 | Ford Global Technologies, Llc | Monitoring machine learning models in resource constrained settings |
| CN114973180B (en) * | 2022-07-18 | 2022-11-01 | 福思(杭州)智能科技有限公司 | Lane line tracking method, device, equipment and storage medium |
| KR102754372B1 (en) * | 2022-08-10 | 2025-01-14 | 주식회사 자이트론 | Automatic Driving Vehicle and Control System thereof |
| JP2024168092A (en) * | 2023-05-23 | 2024-12-05 | トヨタ自動車株式会社 | Method and management system for managing automatic driving functions in vehicles |
| US12482347B2 (en) | 2023-05-23 | 2025-11-25 | Ford Global Technologies, Llc | Digital geospatial breadcrumbs |
| JP2025002291A (en) * | 2023-06-22 | 2025-01-09 | トヨタ自動車株式会社 | Update verification method, update verification system, and update verification program |
Citations (38)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20010095460A (en) | 2000-03-30 | 2001-11-07 | 한민홍 | The method and device of vehicle control to follow the rear of the previous car with a fixed distance at low speed. |
| US20130073138A1 (en) | 2010-05-19 | 2013-03-21 | Glenn Michael Callow | System validation |
| US20140324268A1 (en) | 2010-10-05 | 2014-10-30 | Google Inc. | Zone driving |
| US20150081156A1 (en) | 2008-10-24 | 2015-03-19 | Gray & Company, Inc. | Control and systems for autonomously driven vehicles |
| US20150105961A1 (en) | 2011-09-15 | 2015-04-16 | Bae Systems Plc | Autonomous vehicle and task modelling |
| US20150178578A1 (en) * | 2013-12-19 | 2015-06-25 | Harman International Industries, Incorporated | Vehicle behavior analysis |
| US20160068158A1 (en) | 2014-09-10 | 2016-03-10 | Ford Global Technologies, Llc | Automatic park and reminder system and method of use |
| US20160068156A1 (en) | 2014-09-10 | 2016-03-10 | Volkswagen Ag | Modifying autonomous vehicle driving by recognizing vehicle characteristics |
| US9524648B1 (en) * | 2014-11-17 | 2016-12-20 | Amazon Technologies, Inc. | Countermeasures for threats to an uncrewed autonomous vehicle |
| US20170267237A1 (en) | 2016-03-17 | 2017-09-21 | Fuji Jukogyo Kabushiki Kaisha | Vehicle travel control device |
| US20180060555A1 (en) | 2016-08-24 | 2018-03-01 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US20180096541A1 (en) | 2016-10-04 | 2018-04-05 | Wal-Mart Stores, Inc. | System and Methods for Drone-Based Vehicle Status Determination |
| US20180107215A1 (en) | 2016-10-17 | 2018-04-19 | Uber Technologies, Inc. | Neural network system for autonomous vehicle control |
| US20180188742A1 (en) | 2016-12-30 | 2018-07-05 | DeepMap Inc. | Representing navigable surface boundaries of lanes in high definition maps for autonomous vehicles |
| US20180194286A1 (en) | 2017-01-12 | 2018-07-12 | Mobileye Vision Technologies Ltd. | Determining a road surface characteristic |
| US20180237012A1 (en) * | 2017-02-22 | 2018-08-23 | Ford Global Technologies, Llc | Autonomous vehicle towing |
| US20180348763A1 (en) | 2017-06-02 | 2018-12-06 | Baidu Usa Llc | Utilizing rule-based and model-based decision systems for autonomous driving control |
| US20180349782A1 (en) | 2017-06-06 | 2018-12-06 | PlusAI Corp | Method and system for close loop perception in autonomous driving vehicles |
| US20190025843A1 (en) | 2017-07-18 | 2019-01-24 | Uber Technologies, Inc. | Systems and Methods for Speed Limit Context Awareness |
| US20190088135A1 (en) * | 2017-09-15 | 2019-03-21 | Qualcomm Incorporated | System and method for relative positioning based safe autonomous driving |
| US20190086914A1 (en) * | 2017-09-15 | 2019-03-21 | GM Global Technology Operations LLC | Systems and methods for collaboration between autonomous vehicles |
| US20190147255A1 (en) | 2017-11-15 | 2019-05-16 | Uber Technologies, Inc. | Systems and Methods for Generating Sparse Geographic Data for Autonomous Vehicles |
| US20190206260A1 (en) | 2017-12-28 | 2019-07-04 | Bendix Commercial Vehicle Systems Llc | Initialization and safety maintenance strategy for platooning vehicles |
| US20190258251A1 (en) | 2017-11-10 | 2019-08-22 | Nvidia Corporation | Systems and methods for safe and reliable autonomous vehicles |
| US20190329768A1 (en) | 2017-01-12 | 2019-10-31 | Mobileye Vision Technologies Ltd. | Navigation Based on Detected Size of Occlusion Zones |
| US20190369637A1 (en) | 2017-03-20 | 2019-12-05 | Mobileye Vision Technologies Ltd. | Trajectory selection for an autonomous vehicle |
| US20190377354A1 (en) | 2017-03-01 | 2019-12-12 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating with sensing uncertainty |
| US20190384303A1 (en) | 2018-06-19 | 2019-12-19 | Nvidia Corporation | Behavior-guided path planning in autonomous machine applications |
| US20200019165A1 (en) | 2018-07-13 | 2020-01-16 | Kache.AI | System and method for determining a vehicles autonomous driving mode from a plurality of autonomous modes |
| US10564639B1 (en) | 2012-02-06 | 2020-02-18 | Waymo Llc | System and method for predicting behaviors of detected objects through environment representation |
| US20200223451A1 (en) | 2017-10-30 | 2020-07-16 | Mobileye Vision Technologies Ltd. | Navigation based on sensed looking direction of a pedestrian |
| US20200348672A1 (en) | 2017-06-14 | 2020-11-05 | Mobileye Vision Technologies Ltd. | Safety and comfort constraints for navigation |
| US20210034068A1 (en) | 2016-01-05 | 2021-02-04 | Mobileye Vision Technologies Ltd. | Prioritized constraints for a navigational system |
| US20210056327A1 (en) | 2016-06-27 | 2021-02-25 | Mobileye Vision Technologies Ltd. | Controlling host vehicle based on detected door opening events |
| US20210072765A1 (en) | 2016-10-31 | 2021-03-11 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating lane merges and lane splits |
| US20210110483A1 (en) | 2016-12-23 | 2021-04-15 | Mobileye Vision Technologies Ltd. | Navigation Based on Liability Constraints |
| US20210179096A1 (en) | 2018-03-20 | 2021-06-17 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle |
| US20210271254A1 (en) | 2018-01-07 | 2021-09-02 | Nvidia Corporation | Guiding vehicles through vehicle maneuvers using machine learning models |
Family Cites Families (73)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
| US6640164B1 (en) * | 2001-08-28 | 2003-10-28 | Itt Manufacturing Enterprises, Inc. | Methods and systems for remote control of self-propelled vehicles |
| JP3870924B2 (en) * | 2003-04-04 | 2007-01-24 | 日産自動車株式会社 | Information providing apparatus, information providing system, and information providing program |
| FR2874300B1 (en) * | 2004-08-11 | 2006-11-24 | Renault Sas | AUTOMATIC CALIBRATION METHOD OF A STEREOVISION SYSTEM |
| JP4222411B2 (en) * | 2006-11-08 | 2009-02-12 | 日本電気株式会社 | Vanishing point detection system, vanishing point detection method, and vanishing point detection program |
| US8120644B2 (en) * | 2009-02-17 | 2012-02-21 | Autoliv Asp, Inc. | Method and system for the dynamic calibration of stereovision cameras |
| KR101163446B1 (en) * | 2009-03-18 | 2012-07-18 | 기아자동차주식회사 | A lane departure warning system using a virtual lane and a system according to the same |
| US8352111B2 (en) * | 2009-04-06 | 2013-01-08 | GM Global Technology Operations LLC | Platoon vehicle management |
| CN101894366B (en) * | 2009-05-21 | 2014-01-29 | 北京中星微电子有限公司 | Method and device for acquiring calibration parameters and video monitoring system |
| JP5868586B2 (en) * | 2009-11-04 | 2016-02-24 | ヴァレオ・シャルター・ウント・ゼンゾーレン・ゲーエムベーハー | Road characteristic analysis based on video image, lane detection, and lane departure prevention method and apparatus |
| US20110221901A1 (en) * | 2010-03-11 | 2011-09-15 | Gm Global Technology Operations, Inc. | Adaptive Scene Rendering and V2X Video/Image Sharing |
| US9834153B2 (en) * | 2011-04-25 | 2017-12-05 | Magna Electronics Inc. | Method and system for dynamically calibrating vehicular cameras |
| US10520581B2 (en) * | 2011-07-06 | 2019-12-31 | Peloton Technology, Inc. | Sensor fusion for autonomous or partially autonomous vehicle control |
| EP2629243A1 (en) * | 2012-02-15 | 2013-08-21 | Delphi Technologies, Inc. | Method for detecting and tracking lane markings |
| GB2516405B (en) * | 2012-04-27 | 2016-06-15 | Adobe Systems Inc | Automatic adjustment of images |
| JP5926645B2 (en) * | 2012-08-03 | 2016-05-25 | クラリオン株式会社 | Camera parameter calculation device, navigation system, and camera parameter calculation method |
| US9135705B2 (en) * | 2012-10-16 | 2015-09-15 | Qualcomm Incorporated | Sensor calibration and position estimation based on vanishing point determination |
| US9092986B2 (en) * | 2013-02-04 | 2015-07-28 | Magna Electronics Inc. | Vehicular vision system |
| CN105229419A (en) * | 2013-05-22 | 2016-01-06 | 三菱电机株式会社 | Guider |
| US9129161B2 (en) * | 2013-05-31 | 2015-09-08 | Toyota Jidosha Kabushiki Kaisha | Computationally efficient scene classification |
| JP2015087969A (en) * | 2013-10-31 | 2015-05-07 | 三菱電機株式会社 | Column traveling device and traffic accident prevention device |
| TWI503560B (en) * | 2013-12-25 | 2015-10-11 | 財團法人工業技術研究院 | Vehicle position calibration method and apparatus |
| US9251598B2 (en) * | 2014-04-10 | 2016-02-02 | GM Global Technology Operations LLC | Vision-based multi-camera factory monitoring with dynamic integrity scoring |
| JP6326624B2 (en) * | 2014-04-02 | 2018-05-23 | パナソニックIpマネジメント株式会社 | CALIBRATION DEVICE, CALIBRATION METHOD, CAMERA HAVING CALIBRATION FUNCTION, AND PROGRAM |
| US9919717B2 (en) * | 2014-04-14 | 2018-03-20 | Mitsubishi Electric Corporation | Driving assistance device and driving assistance method |
| US9446791B2 (en) * | 2014-05-09 | 2016-09-20 | Raven Industries, Inc. | Refined row guidance parameterization with Hough transform |
| WO2016013996A1 (en) * | 2014-07-25 | 2016-01-28 | Okan Üni̇versitesi̇ | A close range vehicle following system which can provide vehicle distances and course by using various variables. |
| US9918001B2 (en) * | 2014-08-21 | 2018-03-13 | Toyota Motor Sales, U.S.A., Inc. | Crowd sourcing exterior vehicle images of traffic conditions |
| US9558659B1 (en) * | 2014-08-29 | 2017-01-31 | Google Inc. | Determining the stationary state of detected vehicles |
| DE102015201555A1 (en) * | 2015-01-29 | 2016-08-04 | Robert Bosch Gmbh | Method and device for operating a vehicle |
| US9555807B2 (en) * | 2015-05-01 | 2017-01-31 | Delphi Technologies, Inc. | Automated vehicle parameter modification based on operator override |
| EP3330664B1 (en) * | 2015-07-29 | 2021-01-27 | KYOCERA Corporation | Parallax calculating device, stereo camera device, vehicle, and parallax calculating method |
| EP3125196B1 (en) * | 2015-07-29 | 2018-02-21 | Continental Automotive GmbH | Drive-by calibration from static targets |
| US20170080948A1 (en) * | 2015-09-18 | 2017-03-23 | Faraday&Future Inc. | Vehicle mode adjusting system |
| EP3358295B1 (en) * | 2015-09-28 | 2020-10-07 | Kyocera Corporation | Image processing device, stereo camera device, vehicle, and image processing method |
| US9720415B2 (en) * | 2015-11-04 | 2017-08-01 | Zoox, Inc. | Sensor-based object-detection optimization for autonomous vehicles |
| US10861177B2 (en) * | 2015-11-11 | 2020-12-08 | Zhejiang Dahua Technology Co., Ltd. | Methods and systems for binocular stereo vision |
| EP3174007A1 (en) * | 2015-11-30 | 2017-05-31 | Delphi Technologies, Inc. | Method for calibrating the orientation of a camera mounted to a vehicle |
| US9956956B2 (en) * | 2016-01-11 | 2018-05-01 | Denso Corporation | Adaptive driving system |
| JP6780661B2 (en) * | 2016-01-15 | 2020-11-04 | ソニー株式会社 | Image processing equipment and methods, programs, and image processing systems |
| US10565868B2 (en) * | 2016-02-18 | 2020-02-18 | Ford Global Technologies, Llc | Method and apparatus for traffic light state alerts |
| US10239527B2 (en) * | 2016-02-24 | 2019-03-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Driving ratings for leading or following vehicle selection |
| GB2540039A (en) * | 2016-06-03 | 2017-01-04 | Daimler Ag | Method for operating a vehicle as a following vehicle in a platoon |
| US10645366B2 (en) * | 2016-06-10 | 2020-05-05 | Lucid VR, Inc. | Real time re-calibration of stereo cameras |
| US10203408B2 (en) * | 2016-07-29 | 2019-02-12 | Faraday & Future Inc. | Method and apparatus for detection and ranging fault detection and recovery |
| US10853942B1 (en) * | 2016-08-29 | 2020-12-01 | Amazon Technologies, Inc. | Camera calibration in a mobile environment |
| US9996752B2 (en) * | 2016-08-30 | 2018-06-12 | Canon Kabushiki Kaisha | Method, system and apparatus for processing an image |
| US10514711B2 (en) * | 2016-10-09 | 2019-12-24 | Airspace Systems, Inc. | Flight control using computer vision |
| KR102578517B1 (en) * | 2017-01-13 | 2023-09-14 | 삼성전자주식회사 | Electronic apparatus and control method thereof |
| US10031521B1 (en) * | 2017-01-16 | 2018-07-24 | Nio Usa, Inc. | Method and system for using weather information in operation of autonomous vehicles |
| JP2018164199A (en) * | 2017-03-27 | 2018-10-18 | ソニーセミコンダクタソリューションズ株式会社 | Image processing apparatus and image processing method |
| DE112017007347T5 (en) * | 2017-03-30 | 2019-12-12 | Mitsubishi Electric Corporation | OBJECT DETECTION DEVICE AND VEHICLE |
| WO2018221209A1 (en) * | 2017-05-30 | 2018-12-06 | ソニーセミコンダクタソリューションズ株式会社 | Image processing device, image processing method, and program |
| WO2019000417A1 (en) * | 2017-06-30 | 2019-01-03 | SZ DJI Technology Co., Ltd. | Map generation systems and methods |
| EP3435330B1 (en) * | 2017-07-24 | 2021-09-29 | Aptiv Technologies Limited | Vehicule based method of object tracking |
| US11148680B2 (en) * | 2017-08-08 | 2021-10-19 | Ford Global Technologies, Llc | Method and apparatus for user-defined drive mode changes based on occurring conditions |
| US11089288B2 (en) * | 2017-09-11 | 2021-08-10 | Tusimple, Inc. | Corner point extraction system and method for image guided stereo camera optical axes alignment |
| US11158088B2 (en) * | 2017-09-11 | 2021-10-26 | Tusimple, Inc. | Vanishing point computation and online alignment system and method for image guided stereo camera optical axes alignment |
| KR20190063750A (en) * | 2017-11-30 | 2019-06-10 | 주식회사 만도 | Apparatus and method for controlling vehicle based on 3d space cognition |
| JP7163589B2 (en) * | 2018-02-14 | 2022-11-01 | 株式会社デンソー | Driving support device |
| US10580164B2 (en) * | 2018-04-05 | 2020-03-03 | Microsoft Technology Licensing, Llc | Automatic camera calibration |
| US10733233B2 (en) * | 2018-04-11 | 2020-08-04 | GM Global Technology Operations LLC | Method and apparatus for generating situation awareness graphs using cameras from different vehicles |
| DE102018207533B4 (en) * | 2018-05-15 | 2020-01-16 | Conti Temic Microelectronic Gmbh | Method for autonomously operating a following vehicle in a vehicle network |
| WO2020014090A1 (en) * | 2018-07-07 | 2020-01-16 | Peloton Technology, Inc. | Control of automated following in vehicle convoys |
| US10607416B2 (en) * | 2018-08-30 | 2020-03-31 | Valeo Comfort And Driving Assistance | Conditional availability of vehicular mixed-reality |
| WO2020159245A1 (en) * | 2019-01-31 | 2020-08-06 | 엘지전자 주식회사 | Method for sharing images between vehicles |
| US20200372792A1 (en) * | 2019-05-24 | 2020-11-26 | E-Motion Inc. | Crowdsourced realtime traffic images and videos |
| WO2020263498A1 (en) * | 2019-06-25 | 2020-12-30 | Snap Inc. | Vanishing point stereoscopic image correction |
| US11796654B2 (en) * | 2019-09-04 | 2023-10-24 | Qualcomm Incorporated | Distributed sensor calibration and sensor sharing using cellular vehicle-to-everything (CV2X) communication |
| KR102785227B1 (en) * | 2020-01-22 | 2025-03-25 | 노다르 인크. | Non-rigid stereo vision camera system |
| US11892560B2 (en) * | 2020-02-03 | 2024-02-06 | Nio Technology (Anhui) Co., Ltd | High precision multi-sensor extrinsic calibration via production line and mobile station |
| CN113847930A (en) * | 2020-06-28 | 2021-12-28 | 图森有限公司 | Multi-sensor calibration system |
| US20220194428A1 (en) * | 2020-12-17 | 2022-06-23 | 6 River Systems, Llc | Systems and methods for calibrating sensors of autonomous vehicles |
- 2019
- 2019-07-12 WO PCT/US2019/041720 patent/WO2020014683A1/en not_active Ceased
- 2019-07-15 US US16/511,968 patent/US11994861B2/en active Active
- 2020
- 2020-02-20 US US16/796,334 patent/US12025981B2/en active Active
- 2020-02-20 US US16/796,349 patent/US20210018918A1/en not_active Abandoned
- 2020-02-20 US US16/796,785 patent/US12093040B2/en active Active
- 2020-02-20 US US16/796,243 patent/US20200192372A1/en not_active Abandoned
- 2020-02-20 US US16/796,306 patent/US20200192375A1/en not_active Abandoned
- 2020-02-20 US US16/796,319 patent/US20200183395A1/en active Pending
- 2020-02-20 US US16/796,794 patent/US12298764B2/en active Active
- 2020-02-20 US US16/796,289 patent/US11573569B2/en active Active
- 2020-02-20 US US16/796,754 patent/US20200201329A1/en not_active Abandoned
- 2020-02-20 US US16/796,765 patent/US20200192378A1/en not_active Abandoned
- 2020-02-20 US US16/796,775 patent/US20200192379A1/en not_active Abandoned
- 2020-02-20 US US16/796,264 patent/US11422556B2/en active Active
- 2020-02-20 US US16/796,730 patent/US20200192376A1/en not_active Abandoned
- 2020-02-20 US US16/796,744 patent/US12093039B2/en active Active
- 2022
- 2022-08-22 US US17/821,346 patent/US20230084316A1/en active Pending
- 2022-12-28 US US18/147,632 patent/US20230259131A1/en not_active Abandoned
- 2023
- 2023-11-17 US US18/513,240 patent/US20240085909A1/en active Pending
- 2023-12-21 US US18/392,733 patent/US20240142977A1/en active Pending
Patent Citations (51)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20010095460A (en) | 2000-03-30 | 2001-11-07 | 한민홍 | The method and device of vehicle control to follow the rear of the previous car with a fixed distance at low speed. |
| US20150081156A1 (en) | 2008-10-24 | 2015-03-19 | Gray & Company, Inc. | Control and systems for autonomously driven vehicles |
| US20130073138A1 (en) | 2010-05-19 | 2013-03-21 | Glenn Michael Callow | System validation |
| US20140324268A1 (en) | 2010-10-05 | 2014-10-30 | Google Inc. | Zone driving |
| US20150105961A1 (en) | 2011-09-15 | 2015-04-16 | Bae Systems Plc | Autonomous vehicle and task modelling |
| US10564639B1 (en) | 2012-02-06 | 2020-02-18 | Waymo Llc | System and method for predicting behaviors of detected objects through environment representation |
| US20150178578A1 (en) * | 2013-12-19 | 2015-06-25 | Harman International Industries, Incorporated | Vehicle behavior analysis |
| US20160068158A1 (en) | 2014-09-10 | 2016-03-10 | Ford Global Technologies, Llc | Automatic park and reminder system and method of use |
| US20160068156A1 (en) | 2014-09-10 | 2016-03-10 | Volkswagen Ag | Modifying autonomous vehicle driving by recognizing vehicle characteristics |
| US9524648B1 (en) * | 2014-11-17 | 2016-12-20 | Amazon Technologies, Inc. | Countermeasures for threats to an uncrewed autonomous vehicle |
| US20210034068A1 (en) | 2016-01-05 | 2021-02-04 | Mobileye Vision Technologies Ltd. | Prioritized constraints for a navigational system |
| US20170267237A1 (en) | 2016-03-17 | 2017-09-21 | Fuji Jukogyo Kabushiki Kaisha | Vehicle travel control device |
| US20210056327A1 (en) | 2016-06-27 | 2021-02-25 | Mobileye Vision Technologies Ltd. | Controlling host vehicle based on detected door opening events |
| US20180060555A1 (en) | 2016-08-24 | 2018-03-01 | Bragi GmbH | Digital signature using phonometry and compiled biometric data system and method |
| US20180096541A1 (en) | 2016-10-04 | 2018-04-05 | Wal-Mart Stores, Inc. | System and Methods for Drone-Based Vehicle Status Determination |
| US20180107215A1 (en) | 2016-10-17 | 2018-04-19 | Uber Technologies, Inc. | Neural network system for autonomous vehicle control |
| US20210072765A1 (en) | 2016-10-31 | 2021-03-11 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating lane merges and lane splits |
| US20210110483A1 (en) | 2016-12-23 | 2021-04-15 | Mobileye Vision Technologies Ltd. | Navigation Based on Liability Constraints |
| US20180188742A1 (en) | 2016-12-30 | 2018-07-05 | DeepMap Inc. | Representing navigable surface boundaries of lanes in high definition maps for autonomous vehicles |
| US20180194286A1 (en) | 2017-01-12 | 2018-07-12 | Mobileye Vision Technologies Ltd. | Determining a road surface characteristic |
| US20190329768A1 (en) | 2017-01-12 | 2019-10-31 | Mobileye Vision Technologies Ltd. | Navigation Based on Detected Size of Occlusion Zones |
| US20180237012A1 (en) * | 2017-02-22 | 2018-08-23 | Ford Global Technologies, Llc | Autonomous vehicle towing |
| US20190377354A1 (en) | 2017-03-01 | 2019-12-12 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating with sensing uncertainty |
| US20190369637A1 (en) | 2017-03-20 | 2019-12-05 | Mobileye Vision Technologies Ltd. | Trajectory selection for an autonomous vehicle |
| US20180348763A1 (en) | 2017-06-02 | 2018-12-06 | Baidu Usa Llc | Utilizing rule-based and model-based decision systems for autonomous driving control |
| US20180349782A1 (en) | 2017-06-06 | 2018-12-06 | PlusAI Corp | Method and system for close loop perception in autonomous driving vehicles |
| US20200348672A1 (en) | 2017-06-14 | 2020-11-05 | Mobileye Vision Technologies Ltd. | Safety and comfort constraints for navigation |
| US20190025843A1 (en) | 2017-07-18 | 2019-01-24 | Uber Technologies, Inc. | Systems and Methods for Speed Limit Context Awareness |
| US20190088135A1 (en) * | 2017-09-15 | 2019-03-21 | Qualcomm Incorporated | System and method for relative positioning based safe autonomous driving |
| US20190086914A1 (en) * | 2017-09-15 | 2019-03-21 | GM Global Technology Operations LLC | Systems and methods for collaboration between autonomous vehicles |
| US20200223451A1 (en) | 2017-10-30 | 2020-07-16 | Mobileye Vision Technologies Ltd. | Navigation based on sensed looking direction of a pedestrian |
| US20190258251A1 (en) | 2017-11-10 | 2019-08-22 | Nvidia Corporation | Systems and methods for safe and reliable autonomous vehicles |
| US20190147255A1 (en) | 2017-11-15 | 2019-05-16 | Uber Technologies, Inc. | Systems and Methods for Generating Sparse Geographic Data for Autonomous Vehicles |
| US20190206260A1 (en) | 2017-12-28 | 2019-07-04 | Bendix Commercial Vehicle Systems Llc | Initialization and safety maintenance strategy for platooning vehicles |
| US20210271254A1 (en) | 2018-01-07 | 2021-09-02 | Nvidia Corporation | Guiding vehicles through vehicle maneuvers using machine learning models |
| US20210179096A1 (en) | 2018-03-20 | 2021-06-17 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating a vehicle |
| US20190384303A1 (en) | 2018-06-19 | 2019-12-19 | Nvidia Corporation | Behavior-guided path planning in autonomous machine applications |
| US20200183395A1 (en) | 2018-07-13 | 2020-06-11 | Kache.AI | System and method for automatically identifying an issue from sensor data from an autonomous drive system while the vehicle is static |
| US20210018918A1 (en) | 2018-07-13 | 2021-01-21 | Kache.AI | System and method for improving an autonomous vehicle driving model based on driver engagement conditions |
| US20200192377A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for automatically determining to follow a divergent vehicle in a vehicle's autonomous driving mode |
| US20200192379A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for automatically following a lane while in a vehicle's autonomous driving mode |
| US20200201329A1 (en) | 2018-07-13 | 2020-06-25 | Kache.AI | System and method for automatically following a vehicle in a vehicle's autonomous driving mode based on received instructions |
| US20200192381A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for calibrating camera data using a second image sensor from a second vehicle |
| US20200192378A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for automatically switching a vehicle to follow in a vehicle's autonomous driving mode |
| US20200192376A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for automatically determining to follow a vehicle in a vehicle's autonomous driving mode |
| US20200192375A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for updating a ros node in a convolutional neural network |
| US20200192373A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for detecting a condition prompting an update to an autonomous vehicle driving model |
| US20200192374A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US20200192372A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for updating an autonomous vehicle driving model based on a change in time and/or location |
| US20200192380A1 (en) | 2018-07-13 | 2020-06-18 | Kache.AI | System and method for calibrating an autonomous vehicle camera |
| US20200019165A1 (en) | 2018-07-13 | 2020-01-16 | Kache.AI | System and method for determining a vehicles autonomous driving mode from a plurality of autonomous modes |
Non-Patent Citations (2)
| Title |
|---|
| PCT International Preliminary Report on Patentability in International Application PCT/US2019/041720, dated Jan. 28, 2021, 13 pages. |
| PCT International Search Report and Written Opinion in International Application PCT/US2019/041720, dated Dec. 20, 2019, 19 pages. |
Also Published As
| Publication number | Publication date |
|---|---|
| US12093039B2 (en) | 2024-09-17 |
| US20240142977A1 (en) | 2024-05-02 |
| US20210018918A1 (en) | 2021-01-21 |
| US11994861B2 (en) | 2024-05-28 |
| US20200192378A1 (en) | 2020-06-18 |
| US12093040B2 (en) | 2024-09-17 |
| US12298764B2 (en) | 2025-05-13 |
| US20230084316A1 (en) | 2023-03-16 |
| US20200192381A1 (en) | 2020-06-18 |
| US20200192379A1 (en) | 2020-06-18 |
| US20210018917A1 (en) | 2021-01-21 |
| US20230259131A1 (en) | 2023-08-17 |
| US11573569B2 (en) | 2023-02-07 |
| US20240085909A1 (en) | 2024-03-14 |
| US20200192377A1 (en) | 2020-06-18 |
| US20200192374A1 (en) | 2020-06-18 |
| US11422556B2 (en) | 2022-08-23 |
| US20200192380A1 (en) | 2020-06-18 |
| US20200192373A1 (en) | 2020-06-18 |
| US20200183395A1 (en) | 2020-06-11 |
| US20200201329A1 (en) | 2020-06-25 |
| US20200192372A1 (en) | 2020-06-18 |
| US20200019165A1 (en) | 2020-01-16 |
| WO2020014683A1 (en) | 2020-01-16 |
| US20200192376A1 (en) | 2020-06-18 |
| US20200192375A1 (en) | 2020-06-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240085909A1 (en) | System and method for updating an autonomous vehicle driving model based on the vehicle driving model becoming statistically incorrect |
| US11815623B2 (en) | Single frame 4D detection using deep fusion of camera image, imaging RADAR and LiDAR point cloud | |
| US11733353B2 (en) | Object detection using local (ground-aware) adaptive region proposals on point clouds | |
| US10992860B2 (en) | Dynamic seam adjustment of image overlap zones from multi-camera source images | |
| US11726189B2 (en) | Real-time online calibration of coherent doppler lidar systems on vehicles | |
| US10946868B2 (en) | Methods and devices for autonomous vehicle operation | |
| US10586254B2 (en) | Method and system for adaptive vehicle control in autonomous vehicles | |
| US10982968B2 (en) | Sensor fusion methods for augmented reality navigation | |
| US20190113916A1 (en) | Vehicle path-planner monitor and controller | |
| US11560131B2 (en) | Lane prediction and smoothing for extended motion planning horizon | |
| US20230199450A1 (en) | Autonomous Vehicle Communication Gateway Architecture | |
| RU2775817C2 (en) | Method and system for training machine learning algorithm for detecting objects at a distance |
Legal Events
| Code | Title | Description |
|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| AS | Assignment | Owner name: KACHE.AI, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEVANDOWSKI, ANTHONY;BERNSTEIN, DAVID;ARGUETA, OSCAR;AND OTHERS;REEL/FRAME:064005/0457. Effective date: 20180827 |
| AS | Assignment | Owner name: KACHE.AI, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CULKIN, CATHERINE;REEL/FRAME:064040/0917. Effective date: 20191106 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |