US20170217433A1 - Tracking objects within a dynamic environment for improved localization - Google Patents
- Publication number
- US20170217433A1 (U.S. application Ser. No. 15/010,303)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- objects
- dynamic
- sensor data
- dynamic environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/0097—Predicting future conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3415—Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3492—Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/027—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising intertial navigation means, e.g. azimuth detector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/20—Static objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4041—Position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4044—Direction of movement, e.g. backwards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0953—Predicting travel path or likelihood of collision the prediction being responsive to vehicle dynamic parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
Definitions
- This invention relates generally to the field of vehicle navigation systems, and, more particularly, to vehicle navigation systems which can be utilized when lane line markings have become degraded, obscured, or are nonexistent.
- Active safety and driver assist features, such as lane departure warning, low-speed lane keeping (Traffic Jam Assist, or TJA), and high-speed lane keeping (Highway Assist, or HA), as well as fully autonomous vehicle operation, rely upon localization of the vehicle within the lane to provide their functionality. Localization is defined as a computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of a vehicle's location within it.
- Each of these systems relies upon multiple sensor suites to provide robust and accurate positioning. Examples of currently relied-upon sensor suites include cameras, stereo cameras, the Global Positioning System (GPS), and LIDAR.
- An improved reckoning of position state can be achieved via odometry from the vehicle's four wheel-speed sensors. While these sensors provide a robust estimate of longitudinal position, they are unable to accurately estimate lateral position changes of the vehicle. Thus, in the event of faulty or nonexistent lane-level perception data, limited solutions, if any, exist for continued operation of the aforementioned active safety and vehicle assist features.
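As a rough sketch of the wheel-speed odometry idea, the longitudinal estimate can be computed by averaging the four wheel speeds; the function name, wheel radius, and sample period below are illustrative assumptions, not values from this disclosure:

```python
import math

def longitudinal_odometry(wheel_speeds_rpm, wheel_radius_m, dt_s):
    """Estimate longitudinal displacement from four wheel-speed sensors.

    Averaging the four wheels gives a robust longitudinal estimate but
    carries no lateral information: a purely sideways position change
    leaves every wheel speed unchanged, which is why these sensors cannot
    observe lateral motion."""
    avg_rpm = sum(wheel_speeds_rpm) / len(wheel_speeds_rpm)
    v_mps = avg_rpm / 60.0 * 2.0 * math.pi * wheel_radius_m  # wheel rim speed
    return v_mps * dt_s  # displacement along the direction of travel

# Four wheels at 600 rpm, 0.3 m wheel radius, over a 0.1 s sample period:
d = longitudinal_odometry([600.0, 600.0, 600.0, 600.0], 0.3, 0.1)
```

Integrating such samples over time yields the longitudinal track referred to above, while lateral drift remains unbounded without additional perception.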
- FIG. 1 illustrates an example block diagram of a computing device.
- FIG. 2 illustrates an example computer architecture that facilitates tracking objects within a dynamic environment for improved localization.
- FIG. 3 illustrates a flow chart of an example method for tracking objects within a dynamic environment for improved localization.
- FIG. 4 illustrates another flow chart of an example method for tracking objects within a dynamic environment for improved localization.
- FIG. 5 illustrates an example position interface module.
- FIG. 6 illustrates a portion of a roadway.
- The present invention extends to methods, systems, and computer program products for tracking objects within a dynamic environment for improved localization.
- Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
- Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
- Computer-readable media that store computer-executable instructions are computer storage media (devices).
- Computer-readable media that carry computer-executable instructions are transmission media.
- Embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
- A network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (devices) (or vice versa).
- Computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system.
- RAM can also include solid state drives (SSDs) or PCIx-based real-time memory tiered storage, such as FusionIO.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- The invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
- The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
- Program modules may be located in both local and remote memory storage devices.
- Embodiments of the invention can also be implemented in cloud computing environments.
- Cloud computing is defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly.
- A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.). Databases and servers described with respect to the present invention can be included in a cloud model.
- A “vehicle configuration” is defined as the configuration of a vehicle, including one or more of: vehicle acceleration, vehicle velocity, vehicle position, and vehicle direction.
- Aspects of the invention are directed to tracking objects within a dynamic environment for improved localization.
- Sensing devices are utilized to gather data about a vehicle's environment.
- A vehicle computer system uses previously detected sensor data to estimate the speed and direction of travel of dynamic (e.g., moving) objects.
- The computer system estimates the location of the dynamic objects after a specified period of time based on the estimated speed and direction of the dynamic objects.
- The computer system utilizes this information, as well as currently measured static objects, to localize the vehicle within the dynamic environment and to control the configuration of the vehicle.
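The location estimate after a specified period of time amounts to straight-line dead reckoning of each dynamic object from its estimated speed and direction. A minimal sketch (the function and its arguments are hypothetical, for illustration only):

```python
import math

def propagate_object(x, y, speed, heading_rad, dt):
    """Predict where a dynamic object will be after dt seconds, assuming it
    keeps its estimated speed and direction of travel over that interval."""
    return (x + speed * math.cos(heading_rad) * dt,
            y + speed * math.sin(heading_rad) * dt)

# An object 10 m ahead moving at 5 m/s along the x-axis (heading 0),
# propagated half a second into the future:
px, py = propagate_object(10.0, 0.0, 5.0, 0.0, 0.5)
```

The propagated positions can then be combined with currently measured static objects when localizing the vehicle.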
- FIG. 1 illustrates an example block diagram of a computing device 100.
- Computing device 100 can be used to perform various procedures, such as those discussed herein.
- Computing device 100 can function as a server, a client, or any other computing entity.
- Computing device 100 can perform various communication and data transfer functions as described herein and can execute one or more application programs, such as the application programs described herein.
- Computing device 100 can be any of a wide variety of computing devices, such as a mobile telephone or other mobile device, a desktop computer, a notebook computer, a server computer, a handheld computer, tablet computer and the like.
- Computing device 100 includes one or more processor(s) 102, one or more memory device(s) 104, one or more interface(s) 106, one or more mass storage device(s) 108, one or more Input/Output (I/O) device(s) 110, and a display device 130, all of which are coupled to a bus 112.
- Processor(s) 102 include one or more processors or controllers that execute instructions stored in memory device(s) 104 and/or mass storage device(s) 108.
- Processor(s) 102 may also include various types of computer storage media, such as cache memory.
- Memory device(s) 104 include various computer storage media, such as volatile memory (e.g., random access memory (RAM) 114) and/or nonvolatile memory (e.g., read-only memory (ROM) 116). Memory device(s) 104 may also include rewritable ROM, such as Flash memory.
- Mass storage device(s) 108 include various computer storage media, such as magnetic tapes, magnetic disks, optical disks, solid state memory (e.g., Flash memory), and so forth. As depicted in FIG. 1, a particular mass storage device is a hard disk drive 124. Various drives may also be included in mass storage device(s) 108 to enable reading from and/or writing to the various computer-readable media. Mass storage device(s) 108 include removable media 126 and/or non-removable media.
- I/O device(s) 110 include various devices that allow data and/or other information to be input to or retrieved from computing device 100.
- Example I/O device(s) 110 include cursor control devices, keyboards, keypads, barcode scanners, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, cameras, lenses, CCDs or other image capture devices, and the like.
- Display device 130 includes any type of device capable of displaying information to one or more users of computing device 100.
- Examples of display device 130 include a monitor, display terminal, video projection device, and the like.
- Interface(s) 106 include various interfaces that allow computing device 100 to interact with other systems, devices, or computing environments as well as humans.
- Example interface(s) 106 can include any number of different network interfaces 120, such as interfaces to personal area networks (PANs), local area networks (LANs), wide area networks (WANs), wireless networks (e.g., near field communication (NFC), Bluetooth, Wi-Fi, etc.), and the Internet.
- Other interfaces include user interface 118 and peripheral device interface 122 .
- Bus 112 allows processor(s) 102, memory device(s) 104, interface(s) 106, mass storage device(s) 108, and I/O device(s) 110 to communicate with one another, as well as other devices or components coupled to bus 112.
- Bus 112 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
- A vehicle is outfitted with one or more radar systems.
- The one or more radar systems can be included in active safety and driver assist features such as lane departure warning, low-speed lane keeping (Traffic Jam Assist, or TJA), high-speed lane keeping (Highway Assist, or HA), adaptive cruise control, etc.
- The one or more radar systems can be leveraged to improve position estimation and localization, supplement other perception sensor suites, and improve the robustness of active safety/driver assist systems as a whole.
- Distance-measuring sensors such as radar and LIDAR can be used to aid in position estimation and localization. Measurements can be taken at discrete points in time and compared to one another. By comparing (e.g., two consecutive) scans, it is possible to estimate a vehicle's motion over time. Methods of comparison include, but are not limited to, iterative closest point (ICP).
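The scan-to-scan comparison can be illustrated with the closed-form inner step of ICP: given matched points from two consecutive scans, the least-squares rotation and translation between them follow from a singular value decomposition (the Kabsch algorithm). This sketch assumes the point correspondences are already known, which full ICP establishes iteratively by re-matching nearest points:

```python
import numpy as np

def rigid_transform_2d(prev_pts, curr_pts):
    """Least-squares rotation R and translation t mapping prev_pts onto
    curr_pts (N x 2 arrays, rows already matched). This is the closed-form
    inner step of ICP; full ICP alternates it with nearest-point matching."""
    p_mean, c_mean = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    H = (prev_pts - p_mean).T @ (curr_pts - c_mean)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_mean - R @ p_mean
    return R, t   # ego-motion between scans is the inverse of this map

prev = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
curr = prev + np.array([0.5, 0.0])    # scene appears shifted 0.5 m
R, t = rigid_transform_2d(prev, curr)
```

Applied to static returns only, the recovered transform gives the vehicle's motion between the two scan times.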
- Evaluation of radar/LIDAR scans can be used to: (1) estimate the speed and direction of travel of dynamic objects, (2) propagate these dynamic objects forward by the amount of time between two scans of the distance-measuring sensors, and (3) consider propagated objects from the previous scan, as well as currently measured static objects, in the localization algorithm.
- Sensor suites including, but not limited to, radar and ultrasonic sensors, as well as LIDAR and camera sensors utilizing post-processing techniques, can be used to estimate the speed and direction of travel of dynamic (e.g., moving) objects.
- Algorithms including, but not limited to, clustering techniques, nearest-closest-point methods, and Kalman filter techniques can be used to propagate dynamic objects between sensor scans. Accordingly, by leveraging the estimated trajectories of dynamic objects, the dynamic objects can be propagated forward in time, allowing for more accurate position reckoning for a vehicle.
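One way to realize the Kalman-filter propagation mentioned above is a constant-velocity predict step; the state layout and process-noise model below are illustrative assumptions rather than details from this disclosure:

```python
import numpy as np

def kalman_predict(x, P, dt, q=0.5):
    """Constant-velocity Kalman predict step for one tracked object.

    State x = [px, py, vx, vy]; P is its 4x4 covariance. Propagating the
    state by the inter-scan interval dt is the prediction half of the
    filter; the grown covariance records how much trust the propagated
    position deserves when it is later fused with the next scan."""
    F = np.array([[1.0, 0.0,  dt, 0.0],
                  [0.0, 1.0, 0.0,  dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = q * np.eye(4) * dt            # simple, assumed process-noise model
    return F @ x, F @ P @ F.T + Q

x = np.array([10.0, 0.0, 5.0, 0.0])   # 10 m ahead, moving at 5 m/s
x_pred, P_pred = kalman_predict(x, np.eye(4), 0.1)
```

The subsequent update step, omitted here, would correct the prediction with the object's position in the new scan.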
- FIG. 2 illustrates an example computer architecture in a vehicle 200 that facilitates tracking objects within a dynamic environment for improved localization.
- Vehicle 200 can be a motorized vehicle, such as, for example, a car, a truck, a bus, or a motorcycle.
- Vehicle 200 includes vehicle computer system 201 and sensor devices 211.
- Each of the computer system 201 and sensor devices 211 can be connected to one another over (or be part of) a network, such as, for example, a PAN, a LAN, a WAN, a controller area network (CAN) bus or other in-vehicle bus, and even the Internet.
- Each of the computer system 201 and sensor devices 211 can create and exchange message-related data (e.g., near field communication (NFC) payloads, Bluetooth packets, Internet Protocol (IP) datagrams, and other higher layer protocols that utilize IP datagrams, such as Transmission Control Protocol (TCP), Hypertext Transfer Protocol (HTTP), Simple Mail Transfer Protocol (SMTP), etc.) over the network.
- Sensor devices 211 can include a radar system 212, an image-capture device 213, an inertial navigation system 214, a map 218, and a LIDAR system 219.
- Inertial navigation system 214 can further include a global positioning system (GPS) 215, an inertial measurement unit (IMU) 216, and a dead reckoning (DR) system 217.
- Other types of sensor devices (not shown), such as, for example, ultrasonic sensors and infrared sensors, can also be included in sensor devices 211 .
- Each of the sensor devices is configured to capture sensor data of a specified type by sensing objects in the vicinity of vehicle 200 .
- For example, image-capture device 213 is configured to capture image data, LIDAR system 219 is configured to capture LIDAR data, and so forth.
- Each of the respective types of sensor data can be transmitted to vehicle computer system 201 .
- vehicle computer system 201 includes a position interface module 231 and a control system module 235 .
- Position interface module 231 is configured to receive sensor data from sensor devices 211 .
- Position interface module 231 further includes a sensor data evaluation module 232 , an estimation module 233 , and a localization module 234 .
- Sensor data evaluation module 232 is configured to process and evaluate sensor data received from sensor devices 211 .
- sensor data evaluation module 232 can process sensor data to identify road lane markings, objects in the vehicle environment, including static and dynamic (e.g., moving) objects, and information about the vehicle's configuration, including position on the road, trajectory, and velocity.
- sensor data evaluation module 232 can determine when road lane markings have become degraded, obscured, or are nonexistent.
- Estimation module 233 is configured to utilize sensor data to identify both static and dynamic objects in the vehicle environment. Accordingly, when sensor data evaluation module 232 determines that road lane markings are degraded, obscured, or are non-existent, estimation module 233 can utilize sensor data to identify static and dynamic objects in an area around vehicle 200 . Furthermore, estimation module 233 is configured to estimate the speed and direction of travel of dynamic objects at different time steps.
- Localization module 234 is configured to utilize the results of estimation module 233 to determine position reckoning for improved localization of vehicle 200 within an environment containing other static and/or dynamic objects.
- Control system module 235 is configured to utilize the localization results of position interface module 231 to control the vehicle's configuration, including the vehicle's location, trajectory, and velocity.
- FIG. 3 illustrates a flow chart of an example method 300 for tracking objects within a dynamic environment for improved localization. Method 300 will be described with respect to the components and data of computer architecture 200 .
- each of the sensing devices 211 can sense road marking information as well as static and dynamic objects in an environment around vehicle 200 .
- For example, image-capture device 213 can capture image data of other objects within the environment, LIDAR system 219 can capture LIDAR data of other objects within the environment, and so forth.
- sensor data 221 can indicate the configuration of any static objects and/or dynamic objects within and/or around a portion of roadway where vehicle 200 is traveling.
- Static objects can include signs, posts, mile markers, street lights, trees, medians, guard rails, rocks, stationary (e.g., parked) vehicles, road construction equipment, etc.
- Dynamic objects can include other moving vehicles, pedestrians, cyclists, etc.
- Sensor devices 211 can transmit sensor data 221 to vehicle computer system 201 .
- Position interface module 231 can receive sensor data 221 from sensor devices 211 .
- Method 300 includes detecting that sensor data for objects within the dynamic environment has degraded, the sensor data having been gathered by a plurality of sensors at the vehicle, the sensor data indicating the configuration of objects within the dynamic environment, the objects including one or more static objects and one or more dynamic objects ( 301 ).
- sensor data evaluation module 232 can detect that sensor data 221 has degraded.
- Sensor data evaluation module 232 can process and evaluate sensor data 221 to identify road lane markings, static objects in the vehicle environment, dynamic objects in the vehicle environment, and information about the vehicle's configuration. From processing and evaluation, sensor data evaluation module 232 can determine when road lane markings have become degraded, obscured, or are non-existent (and thus may inhibit the operation of other automated systems of vehicle 200 , such as, for example, a lane assist system).
- method 300 includes estimating a speed and direction of travel for the dynamic object from previously detected sensor data ( 302 ).
- estimation module 233 can utilize sensor data 221 to estimate the speed and direction of travel of other dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling.
- estimation module 233 can estimate the speed and direction of travel of the dynamic objects at different time steps.
- estimation module 233 can utilize sensor data 221 (e.g., from distance measuring sensors such as radar and/or LIDAR) to aid in speed and direction estimation.
- estimation module 233 can compare sensor measurements taken at two discrete points in time.
- Estimation module 233 can compare the two consecutive scans to estimate speed and direction of travel for one or more dynamic objects. Methods of comparison can include iterative closest point (ICP) as well as other algorithms.
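The scan-to-scan comparison described above can be sketched as follows. This is a hedged illustration rather than the patent's implementation: it reduces an object's sensor hits to a centroid and uses the centroid displacement between two consecutive scans as a crude stand-in for a full ICP alignment, and the function name and 2D point representation are assumptions.

```python
import math

def estimate_velocity(scan_t0, scan_t1, dt):
    """Estimate a dynamic object's speed and heading by comparing its
    detected positions in two consecutive scans taken dt seconds apart.
    Each scan is a list of (x, y) hits attributed to the object; the
    centroid displacement stands in for a full ICP alignment."""
    cx0 = sum(p[0] for p in scan_t0) / len(scan_t0)
    cy0 = sum(p[1] for p in scan_t0) / len(scan_t0)
    cx1 = sum(p[0] for p in scan_t1) / len(scan_t1)
    cy1 = sum(p[1] for p in scan_t1) / len(scan_t1)
    dx, dy = cx1 - cx0, cy1 - cy0
    speed = math.hypot(dx, dy) / dt          # meters per second
    heading = math.atan2(dy, dx)             # radians from the +x axis
    return speed, heading
```

Given the hits attributed to one tracked object in two consecutive scans, the returned speed and heading can feed the location-prediction step of method 300.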
- method 300 includes estimating the location of the dynamic object after a specified period of time based on the estimated speed and direction of the dynamic object ( 303 ).
- estimation module 233 can estimate the location of one or more dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling after a specified period of time. Estimated locations for each dynamic object can be calculated based on estimated speed and direction for the dynamic object.
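The location estimate in step 303 amounts to propagating each dynamic object forward under its estimated speed and heading. A minimal sketch, assuming a constant-velocity motion model over the specified period (the description does not mandate a particular motion model):

```python
import math

def propagate(position, speed, heading, dt):
    """Predict an object's (x, y) location dt seconds ahead, assuming it
    holds the speed and heading estimated from previous sensor scans."""
    x, y = position
    return (x + speed * math.cos(heading) * dt,
            y + speed * math.sin(heading) * dt)
```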
- method 300 includes localizing the vehicle within the dynamic environment based on the estimated positions for the one or more moving objects and the positions of the one or more static objects ( 304 ).
- localization module 234 can utilize the results from estimation module 233 to localize vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Localization can be based on estimated positions for other dynamic objects and/or other static objects within and/or around the portion of roadway where vehicle 200 is traveling.
- Position interface module 231 can send the localization of vehicle 200 to control system module 235 .
- Control system module 235 can receive the localization of vehicle 200 from position interface module 231 .
- position interface module 231 essentially creates a map of a dynamic environment, such as, for example, other dynamic objects and/or other static objects within and/or around the portion of roadway where vehicle 200 is traveling.
- the map can be based on one or more of a lane marking on the road, a geographic location of the vehicle, and a predetermined map of the road.
- The map can also include a predicted new position for dynamic objects (e.g., other moving vehicles).
- Method 300 includes using the localization to control the configuration of the vehicle within the dynamic environment ( 305 ).
- control system module 235 can use the location of vehicle 200 to control the configuration of vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Controlling the configuration of vehicle 200 can include accelerating, decelerating, maintaining speed, changing direction, maintaining direction, braking, etc. Control system module 235 can control other vehicle systems, such as, for example, cruise control, to control the configuration of vehicle 200 .
- position interface module 231 can: (1) estimate the speed and direction of travel of dynamic objects, (2) calculate the predicted location of the dynamic objects by propagating the dynamic objects between sensor scans, and (3) utilize the predicted locations of dynamic objects as well as locations of static objects to compensate for degraded, obscured, or nonexistent lane markings.
- aspects of the invention include robust position reckoning within a dynamic environment in which a vehicle is operating.
- FIG. 4 illustrates another flow chart 400 of an example method for tracking objects within a dynamic environment for improved localization.
- Various components in vehicle 200 can interoperate to implement method 400 .
- Method 400 includes measuring distance data ( 401 ). For example, one or more of sensors 211 can measure distances to other objects within and/or around the portion of roadway where vehicle 200 is traveling. Sensor data from the one or more sensors can be combined in sensor data 221 . Method 400 includes receiving and processing sensor hit data from objects ( 402 ). For example, sensor data evaluation module 232 can receive and process sensor data 221 .
- Method 400 includes detecting clustered objects ( 403 ). For example, based on sensor data 221 , estimation module 233 can detect clusters of static and/or dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling. Method 400 includes evaluating if an object is dynamic or static ( 404 ). For example, for each object in a cluster, estimation module 233 can determine if the object is static or dynamic. Method 400 includes estimating speed and direction for dynamic objects ( 405 ). For example, for each dynamic object within and/or around the portion of roadway where vehicle 200 is traveling, estimation module 233 can estimate the speed and direction for the dynamic object. Method 400 includes adding a predicted location for the dynamic object to list of static objects ( 406 ). For example, estimation module 233 can add a predicted location for each dynamic object to a list of locations for static objects. As such, for a specified future time, the location of any objects within and/or around the portion of roadway where vehicle 200 is traveling can be estimated.
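Steps 403 through 406 above can be sketched as a small pipeline. The greedy distance-based clustering, the dictionary-based object records, and the clustering radius are illustrative assumptions; a production system would use more robust clustering and tracking techniques.

```python
import math

def cluster_hits(hits, radius=1.5):
    """Greedy distance-based clustering of (x, y) sensor hits (step 403).
    The radius (meters) is an assumed tuning value."""
    clusters = []
    for hit in hits:
        for cluster in clusters:
            if any(math.dist(hit, p) <= radius for p in cluster):
                cluster.append(hit)
                break
        else:
            clusters.append([hit])
    return clusters

def build_object_map(objects, dt):
    """Steps 404-406: split tracked objects into static and dynamic,
    propagate each dynamic object forward by dt seconds, and append the
    predicted locations to the static-object location list."""
    object_map = [o["pos"] for o in objects if o["speed"] == 0.0]
    for o in objects:
        if o["speed"] > 0.0:
            x, y = o["pos"]
            object_map.append((x + o["speed"] * math.cos(o["heading"]) * dt,
                               y + o["speed"] * math.sin(o["heading"]) * dt))
    return object_map
```

The resulting list approximates the environment at the specified future time and could be handed to the localization algorithm of step 407.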
- Method 400 includes utilizing a localization algorithm ( 407 ).
- localization module 234 can utilize a localization algorithm to localize vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Vehicle 200 can be localized based on estimated locations for any objects within and/or around the portion of roadway where vehicle 200 is traveling at the specified future time.
- FIG. 5 illustrates an example position interface module 501 .
- Position interface module 501 can receive sensor data from a variety of different vehicle sensors at a vehicle, including any of: a radar 511 , a camera 512 , INS (GPS+IMU+DR) 513 , a map drive history 514 , and a LIDAR 515 .
- Position interface module 501 can use a sensor fusion algorithm to localize the vehicle in an environment based on received sensor data. Localization of the vehicle can be represented by a lane level localization 521 , a confidence level 522 , and a fault status 523 .
- Lane level localization 521 can localize the vehicle to a specified roadway lane within some margin of error (e.g., 0.5 m to 2 m).
- Confidence level 522 can indicate how confident position interface module 501 is in lane level localization 521 .
- Fault status 523 can indicate if position interface module 501 experienced a fault during determination of lane level localization 521 .
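The three outputs of position interface module 501 might be grouped into a single result type, as in the sketch below. The field names, types, and the 0-to-1 confidence range are assumptions for illustration; the description only names the three outputs.

```python
from dataclasses import dataclass

@dataclass
class LocalizationResult:
    """Outputs of the position interface module as described: a
    lane-level localization, a confidence level, and a fault status."""
    lane_index: int          # which roadway lane the vehicle occupies
    lateral_offset_m: float  # offset from lane center, within margin of error
    confidence: float        # assumed range: 0.0 (none) .. 1.0 (certain)
    fault: bool              # True if a fault occurred during localization

def is_usable(result, min_confidence=0.7):
    """A consumer such as a lane-keeping controller might gate on the
    confidence and fault fields before trusting the localization."""
    return (not result.fault) and result.confidence >= min_confidence
```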
- FIG. 6 illustrates a portion of a roadway 602 .
- roadway 602 includes lanes 603 , 604 , 605 , and 606 .
- Roadway 602 includes a number of static objects including trees 670 , 671 , and 675 , signs 673 and 674 , and a parked vehicle 672 .
- Roadway 602 also includes moving vehicles 650 , 660 , 661 , 662 , and 663 .
- vehicles 660 and 661 are traveling in essentially the same direction as vehicle 650 .
- vehicles 662 and 663 are traveling in essentially the opposite direction of vehicle 650 .
- Roadway 602 also includes cross-walk 676 .
- Vehicle 650 includes a variety of sensors including an image capture device 651 , a LIDAR system 652 , a radar system 653 , a map 654 , a GPS 655 , and an Inertial Measurement Unit (IMU) 656 .
- Vehicle 650 can also include a computer system similar to vehicle computer system 201 and/or a position interface module similar to position interface module 231 and/or position interface module 501 .
- the sensors can detect the other static objects and the other dynamic objects on roadway 602 .
- the sensors can also detect the lane markings for roadway 602 including lane markings 610 , 620 , and 630 .
- lane markings for roadway 602 can degrade and become less visible as indicated by lane markings 611 , 621 , 631 .
- vehicle 650 can predict the location of vehicles 660 , 661 , 662 , and 663 on roadway 602 at a future point in time.
- Vehicle 650 can combine the predicted future locations of vehicles 660 , 661 , 662 , and 663 with the locations of trees 670 , 671 , and 675 , signs 673 and 674 , and parked vehicle 672 to estimate the environment of roadway 602 at the future point in time.
- Vehicle 650 can use the estimated environment to compensate for the degradation of lane lines 611 , 621 , and 631 , such as, for example, maintaining vehicle 650 in a safe configuration.
- If vehicle 650 predicts that vehicle 660 is essentially at the same distance straight in front of vehicle 650 at the future point in time, vehicle 650 has some level of confidence that it can safely remain in lane 603 if maintaining a current configuration. If vehicle 650 predicts that vehicle 660 is at a lesser distance straight in front of vehicle 650 at the future point in time, vehicle 650 has some level of confidence that it can safely remain in lane 603 if it reduces speed. Depending on other predicted future locations of dynamic and static objects in roadway 602 , vehicle 650 can change to a safe configuration in other ways, such as, for example, changing direction, accelerating, coming to a complete stop, etc.
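The lane-keeping decision described above can be caricatured as a threshold rule comparing the predicted gap to the vehicle ahead against the current gap. The gap values, tolerance, and action names are purely illustrative; the description does not specify thresholds.

```python
def choose_action(predicted_gap_m, current_gap_m, tolerance_m=2.0):
    """Toy decision rule for the scenario above: pick a configuration
    change based on how the gap to the vehicle ahead is predicted to
    evolve. Thresholds are illustrative assumptions only."""
    if predicted_gap_m >= current_gap_m - tolerance_m:
        return "maintain"       # gap roughly holding: keep current configuration
    if predicted_gap_m > tolerance_m:
        return "reduce_speed"   # gap closing: slow down but stay in lane
    return "stop_or_evade"      # gap nearly gone: brake hard or change direction
```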
Abstract
The present invention extends to methods, systems, and computer program products for tracking objects within a dynamic environment for improved localization. Sensing devices are utilized to gather data about a vehicle's environment. In cases where the sensor data has become degraded, such as data indicating that lane lines have become degraded, obscured, or nonexistent, the vehicle computer system uses previously detected sensor data to estimate the speed and direction of travel of moving objects. The computer system then estimates the location of the moving objects after a specified period of time based on the estimated speed and direction of the moving objects. The computer system utilizes this information to localize the vehicle within the dynamic environment and to control the configuration of the vehicle.
Description
- Not applicable.
- 1. Field of the Invention
- This invention relates generally to the field of vehicle navigation systems, and, more particularly, to vehicle navigation systems which can be utilized when lane line markings have become degraded, obscured, or are nonexistent.
- 2. Related Art
- Active safety and driver assist features such as lane departure warning, low-speed lane keeping (Traffic Jam Assist—TJA), high speed lane keeping (Highway Assist—HA) as well as fully autonomous vehicle operation rely upon localization of the vehicle within the lane to provide their functionality. Localization is defined as a computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of a vehicle's location within it. In general, each of these systems relies upon multiple sensor suites to provide robust and accurate positioning. Examples of currently relied upon sensor suites are: camera, stereo cameras, Global Positioning System (GPS), and LIDAR. However, in instances when lane lines become degraded, obscured, or are nonexistent, the camera and LIDAR based solutions are prone to failure. In addition, GPS on its own is not accurate enough for lane-level localization and is prone to dropping out as a result of urban or natural canyon scenarios.
- To facilitate the continued use of these active safety/driver assist features during occasions when positioning sensors operate in a high error or even failed state, it is possible to dead reckon based on the last known position, as well as knowledge of the trajectory of the vehicle. It is possible to perform this reckoning using the vehicle on-board Inertial Measurement Unit (IMU) sensors, which are a suite of body-fixed accelerometers and gyroscopes used to estimate vehicle velocity states. However, when integrating these signals in an effort to estimate position states, small bias errors can quickly accumulate into large position estimation errors.
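The bias-accumulation problem described above is easy to demonstrate numerically: a constant accelerometer bias integrated once yields a linearly growing velocity error, and integrated twice yields a quadratically growing position error. The bias magnitude and sample rate below are assumed, illustrative values, not figures from the description.

```python
def dead_reckon_error(bias_mps2, dt, steps):
    """Double-integrate a constant accelerometer bias to show how a
    small sensor error accumulates into a large position error when
    dead reckoning. Velocity error grows linearly with time; position
    error grows quadratically."""
    velocity_error = 0.0
    position_error = 0.0
    for _ in range(steps):
        velocity_error += bias_mps2 * dt   # first integration
        position_error += velocity_error * dt  # second integration
    return position_error

# e.g., an assumed 0.05 m/s^2 bias sampled at 100 Hz for 60 seconds
# drifts roughly 90 m, far beyond lane-level accuracy.
```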
- An improved reckoning of position state can be achieved via odometry from the vehicle's four wheel-speed sensors. While these sensors provide a robust estimate of longitudinal position, they are unable to accurately estimate lateral position changes of the vehicle. Thus, in the event of faulty or nonexistent lane level perception data, limited, if any, solutions exist for continued operation of the aforementioned active safety and vehicle assist features.
- The specific features, aspects and advantages of the present invention will become better understood with regard to the following description and accompanying drawings where:
- FIG. 1 illustrates an example block diagram of a computing device.
- FIG. 2 illustrates an example computer architecture that facilitates tracking objects within a dynamic environment for improved localization.
- FIG. 3 illustrates a flow chart of an example method for tracking objects within a dynamic environment for improved localization.
- FIG. 4 illustrates another flow chart of an example method for tracking objects within a dynamic environment for improved localization.
- FIG. 5 illustrates an example position interface module.
- FIG. 6 illustrates a portion of a roadway.
- The present invention extends to methods, systems, and computer program products for tracking objects within a dynamic environment for improved localization.
- Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmissions media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. RAM can also include solid state drives (SSDs or PCIx based real time memory tiered Storage, such as FusionIO). Thus, it should be understood that computer storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
- Embodiments of the invention can also be implemented in cloud computing environments. In this description and the following claims, "cloud computing" is defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly. A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.). Databases and servers described with respect to the present invention can be included in a cloud model.
- Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the following description and Claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
- In this description and the following claims, a “vehicle configuration” is defined as the configuration of a vehicle including one or more of: vehicle acceleration, vehicle velocity, vehicle position, and vehicle direction.
- In general, aspects of the invention are directed to tracking objects within a dynamic environment for improved localization. Sensing devices are utilized to gather data about a vehicle's environment. In cases where the sensor data has become degraded, such as data indicating that lane lines have become degraded, obscured, or are nonexistent, a vehicle computer system uses previously detected sensor data to estimate the speed and direction of travel of dynamic (e.g., moving) objects. The computer system then estimates the location of the dynamic objects after a specified period of time based on the estimated speed and direction of the dynamic objects. The computer system utilizes this information, as well as currently measured static objects, to localize the vehicle within the dynamic environment and to control the configuration of the vehicle.
- FIG. 1 illustrates an example block diagram of a computing device 100. Computing device 100 can be used to perform various procedures, such as those discussed herein. Computing device 100 can function as a server, a client, or any other computing entity. Computing device 100 can perform various communication and data transfer functions as described herein and can execute one or more application programs, such as the application programs described herein. Computing device 100 can be any of a wide variety of computing devices, such as a mobile telephone or other mobile device, a desktop computer, a notebook computer, a server computer, a handheld computer, a tablet computer, and the like.
- Computing device 100 includes one or more processor(s) 102, one or more memory device(s) 104, one or more interface(s) 106, one or more mass storage device(s) 108, one or more Input/Output (I/O) device(s) 110, and a display device 130, all of which are coupled to a bus 112. Processor(s) 102 include one or more processors or controllers that execute instructions stored in memory device(s) 104 and/or mass storage device(s) 108. Processor(s) 102 may also include various types of computer storage media, such as cache memory.
- Memory device(s) 104 include various computer storage media, such as volatile memory (e.g., random access memory (RAM) 114) and/or nonvolatile memory (e.g., read-only memory (ROM) 116). Memory device(s) 104 may also include rewritable ROM, such as Flash memory.
- Mass storage device(s) 108 include various computer storage media, such as magnetic tapes, magnetic disks, optical disks, solid state memory (e.g., Flash memory), and so forth. As depicted in FIG. 1, a particular mass storage device is a hard disk drive 124. Various drives may also be included in mass storage device(s) 108 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 108 include removable media 126 and/or non-removable media.
- I/O device(s) 110 include various devices that allow data and/or other information to be input to or retrieved from computing device 100. Example I/O device(s) 110 include cursor control devices, keyboards, keypads, barcode scanners, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, cameras, lenses, CCDs or other image capture devices, and the like.
- Display device 130 includes any type of device capable of displaying information to one or more users of computing device 100. Examples of display device 130 include a monitor, display terminal, video projection device, and the like.
- Interface(s) 106 include various interfaces that allow computing device 100 to interact with other systems, devices, or computing environments as well as humans. Example interface(s) 106 can include any number of different network interfaces 120, such as interfaces to personal area networks (PANs), local area networks (LANs), wide area networks (WANs), wireless networks (e.g., near field communication (NFC), Bluetooth, Wi-Fi, etc., networks), and the Internet. Other interfaces include user interface 118 and peripheral device interface 122.
- Bus 112 allows processor(s) 102, memory device(s) 104, interface(s) 106, mass storage device(s) 108, and I/O device(s) 110 to communicate with one another, as well as other devices or components coupled to bus 112. Bus 112 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
- In one aspect, a vehicle is outfitted with one or more radar systems. The one or more radar systems can be included in active safety and driver assist features such as lane departure warning, low-speed lane keeping (Traffic Jam Assist—TJA), high speed lane keeping (Highway Assist—HA), adaptive cruise control, etc. The one or more radar systems can be leveraged to improve position estimation and localization, supplement other perception sensor suites, and improve the robustness of active safety/driver assist systems as a whole.
- As such, distance measuring sensors such as radar/LIDAR can be used to aid in position estimation and localization. Measurements can be taken at discrete points in time and compared to one another. By comparing (e.g., two consecutive) scans, it is possible to estimate a vehicle's motion in time. Methods of comparison can include but are not limited to: iterative closest point (ICP). To better account for dynamic (e.g., moving) objects, evaluation of radar/LIDAR scans can: (1) estimate the speed and direction of travel of dynamic objects, (2) propagate these dynamic objects forward by the amount of time between two scans of the distance measuring sensors, and (3) consider propagated objects from the previous scan, as well as currently measured static objects, for the localization algorithm.
- These and other similar operations can be performed by an in-vehicle computer system to enable more robust position reckoning within a dynamic roadway environment. Sensor suites including but not limited to radar and ultrasonic sensors, as well as LIDAR and Camera sensors utilizing post processing techniques, can be used to estimate speed and direction of travel of dynamic (e.g., moving) objects. Algorithms including but not limited to: clustering techniques, nearest closest point methods, as well as Kalman filter techniques can be used to propagate dynamic objects between sensor scans. Accordingly, by leveraging the estimated trajectories of dynamic objects, the dynamic objects can be propagated forward in time, allowing for more accurate position reckoning for a vehicle.
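As one example of the Kalman filter techniques mentioned above, the predict step of a constant-velocity filter propagates a dynamic object's state and its uncertainty between sensor scans. This dependency-free sketch uses nested lists for the 4x4 covariance; the process-noise value is an assumed tuning parameter, not something the description specifies.

```python
def kf_predict(state, cov, dt, q=0.1):
    """One predict step of a constant-velocity Kalman filter for
    propagating a dynamic object between scans. State is [x, y, vx, vy];
    cov is a 4x4 covariance as nested lists; q is assumed process noise."""
    x, y, vx, vy = state
    predicted = [x + vx * dt, y + vy * dt, vx, vy]
    # Covariance update: F @ cov @ F.T + Q, with the constant-velocity
    # transition F = [[1,0,dt,0],[0,1,0,dt],[0,0,1,0],[0,0,0,1]].
    F = [[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]]
    FP = [[sum(F[i][k] * cov[k][j] for k in range(4)) for j in range(4)]
          for i in range(4)]
    new_cov = [[sum(FP[i][k] * F[j][k] for k in range(4)) + (q if i == j else 0)
                for j in range(4)] for i in range(4)]
    return predicted, new_cov
```

Repeating the predict step over the inter-scan interval moves each tracked object forward in time while honestly growing its positional uncertainty, which a localization algorithm can then weigh against the static-object measurements.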
-
FIG. 2 illustrates an example computer architecture in avehicle 200 that facilitates tracking objects within a dynamic environment for improved localization.Vehicle 200 can be a motorized vehicle, such as, for example, a car, a truck, a bus, or a motorcycle. Referring toFIG. 2 ,vehicle 200 includesvehicle computer system 201 andsensor devices 211. Each of thecomputer system 201 andsensor devices 211, as well as their respective components, can be connected to one another over (or be part of) a network, such as, for example, a PAN, a LAN, a WAN, a controller area network (CAN) bus or other in-vehicle bus, and even the Internet. Accordingly, each of thecomputer system 201 andsensor devices 211, as well as any other connected computer systems and their components, can create message related data and exchange message related data (e.g., near field communication (NFC) payloads, Bluetooth packets, Internet Protocol (IP) datagrams and other higher layer protocols that utilize IP datagrams, such as, Transmission Control Protocol (TCP), Hypertext Transfer Protocol (HTTP), Simple Mail Transfer Protocol (SMTP), etc.) over the network. -
Sensor devices 211 can include a radar system 212, an image-capture device 213, an inertial navigation system 214, a map 218, and a LIDAR system 219. Inertial navigation system 214 can further include a global positioning system (GPS) 215, an inertial measurement unit (IMU) 216, and a dead reckoning (DR) system 217. Other types of sensor devices (not shown), such as, for example, ultrasonic sensors and infrared sensors, can also be included in sensor devices 211. Each of the sensor devices is configured to capture sensor data of a specified type by sensing objects in the vicinity of vehicle 200. For example, image-capture device 213 is configured to capture image data, LIDAR system 219 is configured to capture LIDAR data, and so forth. Each of the respective types of sensor data can be transmitted to vehicle computer system 201. - As depicted,
vehicle computer system 201 includes an interface position module 231 and a control system module 235. Interface position module 231 is configured to receive sensor data from sensor devices 211. Interface position module 231 further includes a sensor data evaluation module 232, an estimation module 233, and a localization module 234. - Sensor
data evaluation module 232 is configured to process and evaluate sensor data received from sensor devices 211. For example, sensor data evaluation module 232 can process sensor data to identify road lane markings, objects in the vehicle environment, including static and dynamic (e.g., moving) objects, and information about the vehicle's configuration, including position on the road, trajectory, and velocity. Furthermore, sensor data evaluation module 232 can determine when road lane markings have become degraded, obscured, or are nonexistent. -
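One way the lane-marking degradation check described above could be sketched in Python is as a fraction-of-confident-detections test. This is a hypothetical helper: the per-segment confidence scores, threshold, and fraction are assumptions for illustration, not values from the disclosure.

```python
def markings_degraded(marking_confidences, threshold=0.5, min_fraction=0.5):
    """Flag lane markings as degraded, obscured, or nonexistent.

    marking_confidences: per-segment detection confidences in [0, 1]
    from a lane-detection pipeline. Markings are treated as degraded
    when too few segments clear the confidence threshold, and as
    nonexistent when no segments were detected at all.
    """
    if not marking_confidences:
        return True  # nonexistent markings
    ok = sum(1 for c in marking_confidences if c >= threshold)
    return ok / len(marking_confidences) < min_fraction
```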
Estimation module 233 is configured to utilize sensor data to identify both static and dynamic objects in the vehicle environment. Accordingly, when sensor data evaluation module 232 determines that road lane markings are degraded, obscured, or are non-existent, estimation module 233 can utilize sensor data to identify static and dynamic objects in an area around vehicle 200. Furthermore, estimation module 233 is configured to estimate the speed and direction of travel of dynamic objects at different time steps. -
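The speed-and-direction estimate at different time steps can be sketched by differencing the centroid of an object's sensor returns across two scans. This illustrative Python assumes the two clusters have already been associated with the same object:

```python
import math

def centroid(points):
    """Mean (x, y) of a cluster of sensor returns."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def speed_and_heading(cluster_t0, cluster_t1, dt):
    """Estimate an object's speed (m/s) and heading (radians, atan2
    convention) from the displacement of its cluster centroid across
    two scans taken dt seconds apart."""
    (x0, y0), (x1, y1) = centroid(cluster_t0), centroid(cluster_t1)
    dx, dy = x1 - x0, y1 - y0
    return math.hypot(dx, dy) / dt, math.atan2(dy, dx)
```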
Localization module 234 is configured to utilize the results of estimation module 233 to determine position reckoning for improved localization of vehicle 200 within an environment containing other static and/or dynamic objects. Control system module 235 is configured to utilize the localization results of position interface module 231 to control the vehicle's configuration, including the vehicle's location, trajectory, and velocity. -
FIG. 3 illustrates a flow chart of an example method 300 for tracking objects within a dynamic environment for improved localization. Method 300 will be described with respect to the components and data of computer architecture 200. - As
vehicle 200 moves on a roadway, each of the sensing devices 211 can sense road marking information as well as static and dynamic objects in an environment around vehicle 200. For example, image-capture device 213 can capture image data of other objects within the environment, LIDAR system 219 can capture LIDAR data of other objects within the environment, and so forth. Each of the respective types of data can be combined in sensor data 221. Thus, sensor data 221 can indicate the configuration of any static objects and/or dynamic objects within and/or around a portion of roadway where vehicle 200 is traveling. - Static objects can include signs, posts, mile markers, street lights, trees, medians, guard rails, rocks, stationary (e.g., parked) vehicles, road construction equipment, etc. Dynamic objects can include other moving vehicles, pedestrians, cyclists, etc.
-
Sensor devices 211 can transmit sensor data 221 to vehicle computer system 201. Position interface module 231 can receive sensor data 221 from sensor devices 211. -
Method 300 includes detecting that sensor data for objects within the dynamic environment has degraded, the sensor data having been gathered by a plurality of sensors at the vehicle, the sensor data indicating the configuration of objects within the dynamic environment, the objects including one or more static objects and one or more dynamic objects (301). For example, sensor data evaluation module 232 can detect that sensor data 221 has degraded. Sensor data evaluation module 232 can process and evaluate sensor data 221 to identify road lane markings, static objects in the vehicle environment, dynamic objects in the vehicle environment, and information about the vehicle's configuration. From processing and evaluation, sensor data evaluation module 232 can determine when road lane markings have become degraded, obscured, or are non-existent (and thus may inhibit the operation of other automated systems of vehicle 200, such as, for example, a lane assist system). - In response to detecting that the sensor data has become degraded, for each of the one or more dynamic objects,
method 300 includes estimating a speed and direction of travel for the dynamic object from previously detected sensor data (302). For example, estimation module 233 can utilize sensor data 221 to estimate the speed and direction of travel of other dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling. - For any dynamic objects,
estimation module 233 can estimate the speed and direction of travel of the dynamic objects at different time steps. For example, estimation module 233 can utilize sensor data 221 (e.g., from distance measuring sensors such as radar and/or LIDAR) to aid in speed and direction estimation. To do this, estimation module 233 can compare sensor measurements taken at two discrete points in time. Estimation module 233 can compare the two consecutive scans to estimate speed and direction of travel for one or more dynamic objects. Methods of comparison can include iterative closest point (ICP) as well as other algorithms. - In response to determining that the sensor data has become degraded, for each of the one or more dynamic objects,
method 300 includes estimating the location of the dynamic object after a specified period of time based on the estimated speed and direction of the dynamic object (303). For example, estimation module 233 can estimate the location of one or more dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling after a specified period of time. Estimated locations for each dynamic object can be calculated based on estimated speed and direction for the dynamic object. - In response to determining that the sensor data has become degraded,
method 300 includes localizing the vehicle within the dynamic environment based on the estimated positions for the one or more moving objects and the positions of the one or more static objects (304). For example, localization module 234 can utilize the results from estimation module 233 to localize vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Localization can be based on estimated positions for other dynamic objects and/or other static objects within and/or around the portion of roadway where vehicle 200 is traveling. -
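Localization against the combined set of static and propagated dynamic objects can be sketched as follows. This is a deliberately simplified, heading-free version: each landmark "votes" for a vehicle position, and the votes are averaged. The names and the map/vehicle frames are assumptions for illustration, not the disclosed localization algorithm.

```python
def localize(landmarks_map, observations_rel):
    """Estimate the vehicle's (x, y) in the map frame.

    landmarks_map: known (or propagated) landmark positions in the map
    frame. observations_rel: the same landmarks as observed relative to
    the vehicle (heading ignored for simplicity). Each pair votes for a
    vehicle position: landmark minus relative observation; the estimate
    is the mean of the votes.
    """
    votes = [(lx - ox, ly - oy)
             for (lx, ly), (ox, oy) in zip(landmarks_map, observations_rel)]
    n = len(votes)
    return (sum(v[0] for v in votes) / n, sum(v[1] for v in votes) / n)
```

Because propagated dynamic objects are included among the landmarks, the estimate stays usable even when only moving traffic is visible.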
Position interface module 231 can send the localization of vehicle 200 to control system module 235. Control system module 235 can receive the localization of vehicle 200 from position interface module 231. - In one aspect,
position interface module 231 essentially creates a map of a dynamic environment, such as, for example, other dynamic objects and/or other static objects within and/or around the portion of roadway where vehicle 200 is traveling. The map can be based on one or more of a lane marking on the road, a geographic location of the vehicle, and a predetermined map of the road. A new position for dynamic objects (e.g., other moving vehicles) can also be calculated based on an initial position and an initial velocity of the dynamic objects and based on a specified period of time. -
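The new-position calculation described above is just the constant-velocity kinematic update; a minimal Python sketch:

```python
def propagate(position, velocity, dt):
    """New position of a dynamic object after dt seconds, assuming it
    holds its current velocity (constant-velocity model).

    position: initial (x, y) in meters; velocity: (vx, vy) in m/s.
    """
    (x, y), (vx, vy) = position, velocity
    return (x + vx * dt, y + vy * dt)
```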
Method 300 includes using the localization to control the configuration of the vehicle within the dynamic environment (305). For example, control system module 235 can use the location of vehicle 200 to control the configuration of vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Controlling the configuration of vehicle 200 can include accelerating, decelerating, maintaining speed, changing direction, maintaining direction, braking, etc. Control system module 235 can control other vehicle systems, such as, for example, cruise control, to control the configuration of vehicle 200. - Accordingly,
position interface module 231 can: (1) estimate the speed and direction of travel of dynamic objects, (2) calculate the predicted location of the dynamic objects by propagating the dynamic objects between sensor scans, and (3) utilize the predicted locations of dynamic objects as well as locations of static objects to compensate for degraded, obscured, or nonexistent lane markings. As such, aspects of the invention include robust position reckoning within a dynamic environment in which a vehicle is operating. -
FIG. 4 illustrates another flow chart 400 of an example method for tracking objects within a dynamic environment for improved localization. Various components in vehicle 200 can interoperate to implement method 400. -
Method 400 includes measuring distance data (401). For example, one or more of sensors 211 can measure distances to other objects within and/or around the portion of roadway where vehicle 200 is traveling. Sensor data from the one or more sensors can be combined in sensor data 221. Method 400 includes receiving and processing sensor hit data from objects (402). For example, sensor data evaluation module 232 can receive and process sensor data 221. -
Method 400 includes detecting clustered objects (403). For example, based on sensor data 221, estimation module 233 can detect clusters of static and/or dynamic objects within and/or around the portion of roadway where vehicle 200 is traveling. Method 400 includes evaluating if an object is dynamic or static (404). For example, for each object in a cluster, estimation module 233 can determine if the object is static or dynamic. Method 400 includes estimating speed and direction for dynamic objects (405). For example, for each dynamic object within and/or around the portion of roadway where vehicle 200 is traveling, estimation module 233 can estimate the speed and direction for the dynamic object. Method 400 includes adding a predicted location for the dynamic object to a list of static objects (406). For example, estimation module 233 can add a predicted location for each dynamic object to a list of locations for static objects. As such, for a specified future time, the location of any objects within and/or around the portion of roadway where vehicle 200 is traveling can be estimated. -
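Step 403's cluster detection can be sketched as a simple distance-threshold (greedy single-linkage) grouping of sensor hits. The 1.5 m gap is an illustrative parameter, not one from the disclosure:

```python
import math

def cluster_hits(hits, max_gap=1.5):
    """Group raw sensor hits into object clusters.

    A hit joins an existing cluster if it lies within max_gap meters of
    any member; if it bridges two clusters, they are merged. Each
    resulting cluster approximates one object's sensor returns.
    """
    clusters = []
    for hit in hits:
        joined = None
        for c in clusters:
            if any(math.dist(hit, p) <= max_gap for p in c):
                if joined is None:
                    c.append(hit)
                    joined = c
                else:            # hit bridges two clusters: merge them
                    joined.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if joined is None:
            clusters.append([hit])
    return clusters
```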
Method 400 includes utilizing a localization algorithm (407). For example, localization module 234 can utilize a localization algorithm to localize vehicle 200 within and/or around the portion of roadway where vehicle 200 is traveling. Vehicle 200 can be localized based on estimated locations for any objects within and/or around the portion of roadway where vehicle 200 is traveling at the specified future time. -
FIG. 5 illustrates an example position interface module 501. Position interface module 501 can receive sensor data from a variety of different vehicle sensors at a vehicle, including any of: a radar 511, a camera 512, an INS (GPS+IMU+DR) 513, a map drive history 514, and a LIDAR 515. Position interface module 501 can use a sensor fusion algorithm to localize the vehicle in an environment based on received sensor data. Localization of the vehicle can be represented by a lane level localization 521, a confidence level 522, and a fault status 523. Lane level localization 521 can localize the vehicle to a specified roadway lane within some margin of error (e.g., 0.5 m–2 m). Confidence level 522 can indicate how confident position interface module 501 is in lane level localization 521. Fault status 523 can indicate if position interface module 501 experienced a fault during determination of lane level localization 521. -
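The three outputs in FIG. 5 could be carried in a small structure like the following. The field names and the 0-to-1 confidence scale are illustrative assumptions; the disclosure only requires a lane-level estimate, a confidence, and a fault status.

```python
from dataclasses import dataclass

@dataclass
class LocalizationResult:
    """Illustrative output of a position interface module.

    lane_index: the roadway lane the vehicle is localized to
    lateral_error_m: margin of error of the lane-level estimate (m)
    confidence: confidence in the lane estimate, 0..1 (assumed scale)
    fault: True if a fault occurred while computing the localization
    """
    lane_index: int
    lateral_error_m: float
    confidence: float
    fault: bool = False

    def usable(self, min_confidence=0.8):
        """A downstream control system might only act on the estimate
        when confidence is high and no fault was reported."""
        return (not self.fault) and self.confidence >= min_confidence
```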
FIG. 6 illustrates a portion of a roadway 602. As depicted, roadway 602 includes lanes. Roadway 602 includes a number of static objects, including trees, signs, and vehicle 672. Roadway 602 also includes moving vehicles. Some of the moving vehicles are traveling in essentially the same direction as vehicle 650. On the other hand, vehicles 662 and 663 are traveling in essentially the opposite direction of vehicle 650. Roadway 602 also includes cross-walk 676. -
Vehicle 650 includes a variety of sensors including an image capture device 651, a LIDAR system 652, a radar system 653, a map 654, a GPS 655, and an Inertial Measurement Unit (IMU) 656. Vehicle 650 can also include a computer system similar to vehicle computer system 201 and/or a position interface module similar to position interface module 231 and/or position interface module 501. - As
vehicle 650 moves within lane 603, the sensors can detect the other static objects and the other dynamic objects on roadway 602. The sensors can also detect the lane markings on roadway 602. As vehicle 650 proceeds, the lane markings can degrade and become less visible. - In response to degraded lane markings,
vehicle 650 can predict the location of the moving vehicles within roadway 602 at a future point in time. Vehicle 650 can combine the predicted future locations of the moving vehicles with the locations of static objects, such as trees and signs, as well as parked vehicle 670, to estimate the environment of roadway 602 at the future point in time. Vehicle 650 can use the estimated environment to compensate for the degradation of lane lines and keep vehicle 650 in a safe configuration. - For example, if
vehicle 650 predicts that vehicle 660 is essentially at the same distance straight in front of vehicle 650 at the future point in time, vehicle 650 has some level of confidence that it can safely remain in lane 603 if maintaining a current configuration. If vehicle 650 predicts that vehicle 660 is at a lesser distance straight in front of vehicle 650 at the future point in time, vehicle 650 has some level of confidence that it can safely remain in lane 603 if it reduces speed. Depending on other predicted future locations of dynamic and static objects in roadway 602, vehicle 650 can change to a safe configuration in other ways, such as, for example, changing direction, accelerating, coming to a complete stop, etc.
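The gap-based responses in this example can be sketched as a simple decision rule over the predicted distance to the lead vehicle. The thresholds are illustrative assumptions, not values from the disclosure:

```python
def choose_action(predicted_gap_m, current_gap_m, min_gap_m=30.0):
    """Pick a longitudinal action from the predicted gap to the vehicle
    ahead at the future point in time.

    If the predicted gap is roughly unchanged and safe, keep the
    current configuration; if it is shrinking or below the safety
    margin, reduce speed; a gap below half the margin warrants a
    stronger response.
    """
    if predicted_gap_m < min_gap_m / 2:
        return "brake"
    if predicted_gap_m < min_gap_m or predicted_gap_m < current_gap_m:
        return "reduce speed"
    return "maintain"
```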
- The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate embodiments may be used in any combination desired to form additional hybrid embodiments of the invention.
- Further, although specific embodiments of the invention have been described and illustrated, the invention is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the invention is to be defined by the claims appended hereto, any future claims submitted here and in different applications, and their equivalents.
Claims (24)
1. A method for tracking objects within a dynamic environment for improved localization, the method comprising:
detecting that sensor data for objects within the dynamic environment has degraded;
estimating a speed and direction of travel for moving objects from previously detected sensor data;
estimating a location of moving objects after a specified period of time, including for each moving object calculating a new position of the moving object based on an initial position and an initial velocity of the moving object, and the specified period of time;
localizing a vehicle within the dynamic environment; and
using the localization to control a configuration of the vehicle.
2. The method of claim 1, wherein detecting that sensor data for objects within the dynamic environment has degraded comprises detecting that lane lines on a roadway have become one or more of: degraded, obscured, or nonexistent.
3. The method of claim 1, wherein using the localization to control a configuration of the vehicle comprises using localization to control one or more of: acceleration, speed, or direction for the vehicle.
4. The method of claim 1, wherein localizing a vehicle within the dynamic environment comprises localizing the vehicle within the dynamic environment within a specified confidence interval.
5. The method of claim 1, wherein localizing the vehicle within the dynamic environment comprises calculating the configuration of the vehicle to maintain safe operation of the vehicle.
6. The method of claim 1, wherein using the localization to control the configuration of the vehicle comprises utilizing the vehicle control system to place the vehicle in a safe configuration.
7. A method for use at a vehicle computer system, the computer system including one or more processors and system memory, the method for tracking objects within a dynamic environment for improved localization, the method comprising the processor:
detecting that sensor data for objects within the dynamic environment has degraded, the sensor data having been gathered by a plurality of sensors at the vehicle, the sensor data indicating the configuration of objects within the dynamic environment, the objects including one or more static objects and one or more dynamic objects;
in response to detecting that the sensor data has become degraded, for each of the one or more dynamic objects:
estimating a speed and direction of travel for the dynamic object from previously detected sensor data; and
estimating a location of the dynamic object after a specified period of time, including calculating a new position for the dynamic object based on an initial position and an initial velocity of the dynamic object, and the specified period of time;
using a localization to control a configuration of the vehicle within the dynamic environment.
8. The method of claim 7, further comprising obtaining the sensor data from one or more sensors, the one or more sensors selected from among: a camera, a global positioning system (GPS), a LIDAR, a radar, an ultrasonic sensor, an infrared sensor, and an inertial measurement unit (IMU).
9. The method of claim 7, wherein detecting that sensor data for objects within the dynamic environment has degraded comprises detecting that lane lines on a roadway have become one or more of: degraded, obscured, or nonexistent.
10. The method of claim 7, wherein using a localization to control the configuration of the vehicle within the dynamic environment comprises localizing the vehicle within the dynamic environment within a specified confidence interval.
11. The method of claim 7, wherein using a localization to control the configuration of the vehicle within the dynamic environment comprises indicating that there was a fault in using the localization to control the configuration of the vehicle.
12. The method of claim 7, wherein estimating a speed and direction of travel for the dynamic object from previously detected sensor data comprises identifying an object that is traveling in essentially the same direction as the vehicle.
13. The method of claim 7, wherein estimating a speed and direction of travel for the dynamic object from previously detected sensor data comprises identifying an object that is traveling in essentially the opposite direction as the vehicle.
14. (canceled)
15. The method of claim 7, further comprising creating a map of the dynamic environment of the vehicle, wherein the map is based on at least one of: a lane marking on the road, a geographic location of the vehicle, and a predetermined map of the road.
16. (canceled)
17. The method of claim 15, further comprising localizing the vehicle within the dynamic environment based on the estimated locations for the one or more dynamic objects, including calculating a configuration for the vehicle to maintain safe autonomous operations in a roadway environment.
18. (canceled)
19. (canceled)
20. The computer system of claim 21, further comprising the one or more processors executing the instructions stored in the system memory to obtain the sensor data for the objects within the dynamic environment from one or more sensors, the one or more sensors selected from among: a camera, a global positioning system (GPS), a LIDAR, a radar, an ultrasonic sensor, an infrared sensor, and an inertial measurement unit (IMU).
21. A computer system, the computer system comprising:
one or more processors;
system memory coupled to the one or more processors, the system memory storing instructions that are executable by the one or more processors; and
the one or more processors executing the instructions stored in the system memory to track objects within a dynamic environment, including the following:
detect that sensor data for objects within the dynamic environment has degraded;
estimate a speed and direction of travel for moving objects from previously detected sensor data;
estimate a location of moving objects after a specified period of time;
localize a vehicle within the dynamic environment within a specified confidence interval; and
use the localization to control a configuration of the vehicle.
22. The computer system of claim 21, wherein the one or more processors executing the instructions stored in the system memory to use the localization to control a configuration of the vehicle comprises the one or more processors executing the instructions stored in the system memory to indicate that there was a fault in using the localization to control the configuration of the vehicle within the dynamic environment.
23. The computer system of claim 21, wherein the one or more processors executing the instructions stored in the system memory to estimate a speed and direction of travel for moving objects from previously detected sensor data comprises the one or more processors executing the instructions stored in the system memory to identify an object that is traveling in essentially the opposite direction as the vehicle.
24. The computer system of claim 21, wherein the one or more processors executing the instructions stored in the system memory to estimate a location of moving objects after a specified period of time comprises the one or more processors executing the instructions stored in the system memory to, for each moving object, calculate a new position of the moving object based on an initial position and an initial velocity of the moving object, and the specified period of time.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/010,303 US9707961B1 (en) | 2016-01-29 | 2016-01-29 | Tracking objects within a dynamic environment for improved localization |
CN201710046746.XA CN107024215B (en) | 2016-01-29 | 2017-01-22 | Tracking objects within a dynamic environment to improve localization |
DE102017101466.7A DE102017101466A1 (en) | 2016-01-29 | 2017-01-25 | TRACKING OBJECTS IN A DYNAMIC ENVIRONMENT FOR IMPROVED LOCALIZATION |
RU2017102674A RU2017102674A (en) | 2016-01-29 | 2017-01-27 | MONITORING OBJECTS IN A DYNAMIC ENVIRONMENT FOR IMPROVED LOCALIZATION |
GB1701393.9A GB2547999A (en) | 2016-01-29 | 2017-01-27 | Tracking objects within a dynamic environment for improved localization |
MX2017001355A MX2017001355A (en) | 2016-01-29 | 2017-01-30 | Tracking objects within a dynamic environment for improved localization. |
US15/487,184 US10077054B2 (en) | 2016-01-29 | 2017-04-13 | Tracking objects within a dynamic environment for improved localization |
Publications (2)
Publication Number | Publication Date |
---|---|
US9707961B1 US9707961B1 (en) | 2017-07-18 |
US20170217433A1 true US20170217433A1 (en) | 2017-08-03 |
US9562778B2 (en) * | 2011-06-03 | 2017-02-07 | Robert Bosch Gmbh | Combined radar and GPS localization system |
WO2012167301A1 (en) | 2011-06-10 | 2012-12-13 | Navisens Pty Ltd | Positioning, tracking and trajectory estimation of a mobile object |
EP2720460B1 (en) * | 2011-06-13 | 2017-01-04 | Honda Motor Co., Ltd. | Driving assistance device |
JP5796187B2 (en) * | 2011-10-31 | 2015-10-21 | パナソニックIpマネジメント株式会社 | Evaluation value calculation apparatus and evaluation value calculation method |
US8935057B2 (en) * | 2012-01-17 | 2015-01-13 | LimnTech LLC | Roadway mark data acquisition and analysis apparatus, systems, and methods |
KR101372023B1 (en) * | 2012-05-31 | 2014-03-07 | 현대자동차주식회사 | Apparatus and method for detecting moving-object of surrounding of vehicle |
US8706417B2 (en) * | 2012-07-30 | 2014-04-22 | GM Global Technology Operations LLC | Anchor lane selection method using navigation input in road change scenarios |
US8972093B2 (en) * | 2013-04-08 | 2015-03-03 | Toyota Motor Engineering & Manufacturing North America, Inc. | Lane-based localization |
EP2793041A1 (en) * | 2013-04-15 | 2014-10-22 | Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO | Assured vehicle absolute localisation |
DE102013214308A1 (en) * | 2013-07-22 | 2015-01-22 | Robert Bosch Gmbh | Distance controller for motor vehicles |
US9067671B2 (en) * | 2013-07-25 | 2015-06-30 | Disney Enterprises, Inc. | Visual localization of unmanned aerial vehicles based on marker detection and processing |
KR20150055271A (en) | 2013-11-13 | 2015-05-21 | 현대모비스 주식회사 | Apparatus for determining motion characteristics of target and device for controlling driving route of vehicle with the said apparatus |
JP6340812B2 (en) * | 2014-02-18 | 2018-06-13 | 村田機械株式会社 | Autonomous vehicle |
DE102014208009A1 (en) * | 2014-04-29 | 2015-10-29 | Bayerische Motoren Werke Aktiengesellschaft | Capture static and dynamic objects |
JP2016090274A (en) * | 2014-10-30 | 2016-05-23 | トヨタ自動車株式会社 | Alarm apparatus, alarm system, and portable terminal |
CN105069859B (en) * | 2015-07-24 | 2018-01-30 | 深圳市佳信捷技术股份有限公司 | Vehicle running state monitoring method and device |
US9804599B2 (en) * | 2015-11-04 | 2017-10-31 | Zoox, Inc. | Active lighting control for communicating a state of an autonomous vehicle to entities in a surrounding environment |
US10745003B2 (en) * | 2015-11-04 | 2020-08-18 | Zoox, Inc. | Resilient safety system for a robotic vehicle |
US9786171B2 (en) * | 2016-01-26 | 2017-10-10 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for detecting and distributing hazard data by a vehicle |
- 2016
  - 2016-01-29 US US15/010,303 patent/US9707961B1/en active Active
- 2017
  - 2017-01-22 CN CN201710046746.XA patent/CN107024215B/en active Active
  - 2017-01-25 DE DE102017101466.7A patent/DE102017101466A1/en active Pending
  - 2017-01-27 RU RU2017102674A patent/RU2017102674A/en not_active Application Discontinuation
  - 2017-01-27 GB GB1701393.9A patent/GB2547999A/en not_active Withdrawn
  - 2017-01-30 MX MX2017001355A patent/MX2017001355A/en unknown
  - 2017-04-13 US US15/487,184 patent/US10077054B2/en active Active
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10151843B2 (en) * | 2011-11-22 | 2018-12-11 | Radio Systems Corporation | Systems and methods of tracking position and speed in GNSS applications |
US11514158B2 (en) | 2016-11-04 | 2022-11-29 | Microsoft Technology Licensing, Llc | IoT security service |
US10972456B2 (en) | 2016-11-04 | 2021-04-06 | Microsoft Technology Licensing, Llc | IoT device authentication |
US11639171B2 (en) | 2016-12-21 | 2023-05-02 | Toyota Jidosha Kabushiki Kaisha | Lane keeping system responsive to steering input |
US10967854B2 (en) * | 2016-12-21 | 2021-04-06 | Toyota Jidosha Kabushiki Kaisha | Lane keeping system responsive to steering input |
US20180170371A1 (en) * | 2016-12-21 | 2018-06-21 | Toyota Jidosha Kabushiki Kaisha | Driving supporter |
US11727794B2 (en) | 2018-03-14 | 2023-08-15 | Micron Technology, Inc. | Systems and methods for evaluating and sharing human driving style information with proximate vehicles |
US11861913B2 (en) | 2018-04-11 | 2024-01-02 | Lodestar Licensing Group Llc | Determining autonomous vehicle status based on mapping of crowdsourced object data |
US11866020B2 (en) | 2018-06-15 | 2024-01-09 | Lodestar Licensing Group Llc | Detecting road conditions based on braking event data received from vehicles |
CN113383586A (en) * | 2019-02-02 | 2021-09-10 | 索尼集团公司 | Apparatus, method and storage medium for wireless communication system |
US20220060854A1 (en) * | 2019-02-02 | 2022-02-24 | Sony Group Corporation | Device, method for wireless communication system, and storage medium |
EP4083959A1 (en) * | 2021-04-26 | 2022-11-02 | Nio Technology (Anhui) Co., Ltd | Traffic flow machine-learning modeling system and method applied to vehicles |
US20230065284A1 (en) * | 2021-09-01 | 2023-03-02 | Baidu Usa Llc | Control and planning with localization uncertainty |
Also Published As
Publication number | Publication date |
---|---|
CN107024215B (en) | 2022-06-07 |
GB201701393D0 (en) | 2017-03-15 |
US20170217434A1 (en) | 2017-08-03 |
MX2017001355A (en) | 2018-07-30 |
US10077054B2 (en) | 2018-09-18 |
RU2017102674A (en) | 2018-08-02 |
CN107024215A (en) | 2017-08-08 |
US9707961B1 (en) | 2017-07-18 |
GB2547999A (en) | 2017-09-06 |
DE102017101466A1 (en) | 2017-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10077054B2 (en) | Tracking objects within a dynamic environment for improved localization | |
US10810872B2 (en) | Use sub-system of autonomous driving vehicles (ADV) for police car patrol | |
US10816973B2 (en) | Utilizing rule-based and model-based decision systems for autonomous driving control | |
US11545033B2 (en) | Evaluation framework for predicted trajectories in autonomous driving vehicle traffic prediction | |
US10816979B2 (en) | Image data acquisition logic of an autonomous driving vehicle for capturing image data using cameras | |
US10990855B2 (en) | Detecting adversarial samples by a vision based perception system | |
JPWO2018235239A1 (en) | Vehicle information storage method, vehicle travel control method, and vehicle information storage device | |
US20210389133A1 (en) | Systems and methods for deriving path-prior data using collected trajectories | |
US10782411B2 (en) | Vehicle pose system | |
US20210403001A1 (en) | Systems and methods for generating lane data using vehicle trajectory sampling | |
US20220194412A1 (en) | Validating Vehicle Sensor Calibration | |
US11408739B2 (en) | Location correction utilizing vehicle communication networks | |
US11668573B2 (en) | Map selection for vehicle pose system | |
US11456890B2 (en) | Open and safe monitoring system for autonomous driving platform | |
US20220028262A1 (en) | Systems and methods for generating source-agnostic trajectories | |
JP6903598B2 (en) | Information processing equipment, information processing methods, information processing programs, and mobiles | |
WO2020113038A1 (en) | Tuning autonomous vehicle dispatch using autonomous vehicle performance | |
EP3339808B1 (en) | Positioning objects in an augmented reality display | |
US20210048819A1 (en) | Apparatus and method for determining junction | |
US11128981B2 (en) | Cellular network delivery of travel safety alerts | |
US20190279504A1 (en) | Method, device and system for wrong-way driver detection | |
US10532750B2 (en) | Method, device and system for wrong-way driver detection | |
US20220198714A1 (en) | Camera to camera calibration | |
US20200402396A1 (en) | Method, device and system for wrong-way driver detection | |
US20210405641A1 (en) | Detecting positioning of a sensor system associated with a vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HALDER, BIBHRAJIT;VARNHAGEN, SCOTT;SIGNING DATES FROM 20160111 TO 20160112;REEL/FRAME:037620/0027 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |