US20170139417A1 - Distinguishing Lane Markings for a Vehicle to Follow - Google Patents
- Publication number
- US20170139417A1 (application US14/943,573)
- Authority
- US
- United States
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G06K9/00798—
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4041—Position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2300/00—Purposes or special features of road vehicle drive control systems
- B60Y2300/10—Path keeping
- B60Y2300/12—Lane keeping
Definitions
- This invention relates generally to the field of road line detection systems, and, more particularly, to road line detection systems which can distinguish lane markings for a vehicle to follow.
- Significant benefits result from vehicles becoming more automated or completely autonomous, including reduced traffic accidents, improved traffic flow, and reduced fuel consumption. For example, many vehicles utilize collision avoidance technology in which the vehicle's computer system can apply the brakes if the vehicle's sensors detect that the vehicle is in danger of colliding with another vehicle or object. Many vehicles also offer a parallel parking feature that allows the vehicle to parallel park without any input from the driver.
- The viability of automated or autonomous vehicles is heavily dependent on the ability of the vehicle sensors and the vehicle computer system to identify and process the vehicle's environment and to react to different situations in a safe and efficient manner.
- The vehicle environment is very dynamic. The vehicle computer system needs to perform in varying weather conditions, such as rain or snow, and react to various road conditions, such as icy roads or roads that are under construction.
- The challenge is to guarantee a safe and efficient driving experience, regardless of the vehicle environment.
- Lane markings can be used as a means to safely guide the vehicle along its route. However, road surface markings include not only lane markings but also navigation information, such as turning lane indicators, speed limit information, crosswalk information, railroad information, and high-occupancy vehicle (HOV) markings. Distinguishing between lane markings and other road surface markings can be difficult.
- Lanes can be shifted to one side or another during periods of construction to facilitate the maintenance of roads.
- During such maintenance, new lane markings are painted onto the road, and old lane markings are either scraped away or painted in the color of the road in an effort to mask them.
- Masking attempts are not always fully successful, and old lane markings can sometimes still be seen after scraping or re-painting. As such, conditions may exist where two sets of lane markings can be seen for a particular lane of the road. Determining which set of lane markings to follow in an automated manner can be difficult.
- FIG. 1 illustrates an example block diagram of a computing device.
- FIG. 2 illustrates an example computer architecture that facilitates distinguishing lane markings for a vehicle to follow.
- FIG. 3 illustrates a flow chart of an example method for distinguishing lane markings for a vehicle to follow.
- FIG. 4 illustrates an example roadway scenario.
- The present invention extends to methods, systems, and computer program products for distinguishing lane markings for a vehicle to follow.
- Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
- Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
- Computer-readable media that store computer-executable instructions are computer storage media (devices).
- Computer-readable media that carry computer-executable instructions are transmission media.
- Embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
- Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (devices) (or vice versa).
- For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system.
- RAM can also include solid state drives (SSDs or PCIx based real time memory tiered Storage, such as FusionIO).
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- The invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like.
- The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
- Program modules may be located in both local and remote memory storage devices.
- Embodiments of the invention can also be implemented in cloud computing environments.
- Cloud computing is defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization, released with minimal management effort or service provider interaction, and then scaled accordingly.
- A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.). Databases and servers described with respect to the present invention can be included in a cloud model.
- Aspects of the invention are directed to distinguishing lane markings for a vehicle to follow.
- At least two different types of sensing devices gather data related to the vehicle's environment. The data is processed to identify road surface markings and neighboring vehicles, and relevant regions of interest are extracted from the processed data.
- Lane marking data and neighboring vehicle data are utilized to identify the correct set of lane markings for a vehicle to follow. The correct lane markings are utilized by the vehicle computer system and control system to navigate the vehicle in the correct lane of the road.
- FIG. 1 illustrates an example block diagram of a computing device 100 .
- Computing device 100 can be used to perform various procedures, such as those discussed herein.
- Computing device 100 can function as a server, a client, or any other computing entity.
- Computing device 100 can perform various communication and data transfer functions as described herein and can execute one or more application programs, such as the application programs described herein.
- Computing device 100 can be any of a wide variety of computing devices, such as a mobile telephone or other mobile device, a desktop computer, a notebook computer, a server computer, a handheld computer, tablet computer and the like.
- Computing device 100 includes one or more processor(s) 102 , one or more memory device(s) 104 , one or more interface(s) 106 , one or more mass storage device(s) 108 , one or more Input/Output (I/O) device(s) 110 , and a display device 130 all of which are coupled to a bus 112 .
- Processor(s) 102 include one or more processors or controllers that execute instructions stored in memory device(s) 104 and/or mass storage device(s) 108 .
- Processor(s) 102 may also include various types of computer storage media, such as cache memory.
- Memory device(s) 104 include various computer storage media, such as volatile memory (e.g., random access memory (RAM) 114 ) and/or nonvolatile memory (e.g., read-only memory (ROM) 116 ). Memory device(s) 104 may also include rewritable ROM, such as Flash memory.
- Mass storage device(s) 108 include various computer storage media, such as magnetic tapes, magnetic disks, optical disks, solid state memory (e.g., Flash memory), and so forth. As depicted in FIG. 1 , a particular mass storage device is a hard disk drive 124 . Various drives may also be included in mass storage device(s) 108 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 108 include removable media 126 and/or non-removable media.
- I/O device(s) 110 include various devices that allow data and/or other information to be input to or retrieved from computing device 100 .
- Example I/O device(s) 110 include cursor control devices, keyboards, keypads, barcode scanners, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, cameras, lenses, CCDs or other image capture devices, and the like.
- Display device 130 includes any type of device capable of displaying information to one or more users of computing device 100 .
- Examples of display device 130 include a monitor, display terminal, video projection device, and the like.
- Interface(s) 106 include various interfaces that allow computing device 100 to interact with other systems, devices, or computing environments as well as humans.
- Example interface(s) 106 can include any number of different network interfaces 120 , such as interfaces to personal area networks (PANs), local area networks (LANs), wide area networks (WANs), wireless networks (e.g., near field communication (NFC), Bluetooth, Wi-Fi, etc., networks), and the Internet.
- Other interfaces include user interface 118 and peripheral device interface 122 .
- Bus 112 allows processor(s) 102 , memory device(s) 104 , interface(s) 106 , mass storage device(s) 108 , and I/O device(s) 110 to communicate with one another, as well as other devices or components coupled to bus 112 .
- Bus 112 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
- Aspects of the invention can be used to identify and compare road surface markings in order to correctly abide by relevant (e.g., newer) road instructions.
- Camera, lidar (light detection and ranging), and other range sensors can be used to capture information of the road and surrounding vehicles.
- Computer vision and sensor fusion algorithms employing neural networks can be trained to recognize surrounding vehicles and conflicting road surface marking regions of interest on roadways and parking lots. When multiple sets of markings are detected, factors including intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling can be used to compare surface marking sets and determine relevant (e.g., newer) markings to follow. Other roadway marking information can also be used. The behavior of other vehicles can also be used to add certainty when determining a set of markings to follow.
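For illustration, the comparison of conflicting marking sets described above can be sketched as a weighted score over the listed attributes. The attribute names, weights, and scoring rule below are illustrative assumptions, not the patent's algorithm:

```python
from dataclasses import dataclass

@dataclass
class MarkingSet:
    """Attributes of one candidate set of lane markings (hypothetical schema)."""
    color_intensity: float  # 0..1, normalized paint brightness
    reflectivity: float     # 0..1, camera/lidar return strength
    has_reflectors: bool    # raised lane reflectors detected
    paint_cracking: float   # 0..1, fraction of marking area cracked
    tape_peeling: float     # 0..1, fraction of road tape peeled

def relevance_score(m: MarkingSet) -> float:
    # Higher score suggests fresher, more relevant markings.
    # Weights are illustrative only.
    score = 0.4 * m.color_intensity + 0.3 * m.reflectivity
    if m.has_reflectors:
        score += 0.2
    score -= 0.05 * m.paint_cracking + 0.05 * m.tape_peeling
    return score

def select_relevant(sets):
    """Return the index of the marking set the vehicle should follow."""
    return max(range(len(sets)), key=lambda i: relevance_score(sets[i]))

# Freshly painted markings vs. faded, partially masked old markings
new_set = MarkingSet(0.9, 0.8, True, 0.05, 0.0)
old_set = MarkingSet(0.3, 0.2, False, 0.6, 0.4)
print(select_relevant([new_set, old_set]))  # 0 (the fresher set)
```

In practice such weights would be learned or tuned rather than fixed by hand, and the same comparison can be applied per attribute as the patent describes.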
- Identifying and comparing road surface markings can include: (1) Using neural networks to identify surface markings on the road and other vehicles within sensor viewing ranges, (2) Determining bounding boxes for identified regions of interest, (3) Sending image data within bounding boxes to algorithms configured to determine if there are multiple sets of road surface markings, and if so, compare the sets to determine the relevant set of markings to follow, and (4) Sending the relevant road surface markings to the vehicle decision making algorithms.
- FIG. 2 illustrates an example computer architecture 200 that facilitates distinguishing lane markings for a vehicle to follow.
- Computer architecture 200 can be contained within a vehicle, such as, for example, a car, a truck, a bus, or a motorcycle.
- Computer architecture 200 includes computer system 201 , image-capture device 210 , and lidar system 220 . In some aspects, radar system 230 is also included.
- Each of computer system 201 , image-capture device 210 , lidar system 220 , and radar system 230 , as well as their respective components can be connected to one another over (or be part of) a network, such as, for example, a PAN, a LAN, a WAN, and even the Internet.
- Each of computer system 201 , image-capture device 210 , lidar system 220 , and radar system 230 , as well as any other connected computer systems and their components, can create message related data and exchange message related data (e.g., near field communication (NFC) payloads, Bluetooth packets, Internet Protocol (IP) datagrams and other higher layer protocols that utilize IP datagrams, such as, Transmission Control Protocol (TCP), Hypertext Transfer Protocol (HTTP), Simple Mail Transfer Protocol (SMTP), etc.) over the network.
- Each of image-capture device 210 , lidar system 220 , and possibly radar system 230 is included in a vehicle.
- Each of the sensing devices is configured to sense data in the vicinity of the vehicle.
- Image-capture device 210 is configured to capture image data 212 , lidar system 220 is configured to capture lidar data 222 , and radar system 230 is configured to capture radar data 232 .
- Computer system 201 includes data processing module 240 and control system module 250 .
- Data processing module 240 is configured to receive data from the sensing devices.
- For example, data processing module 240 can receive image data 212 from image-capture device 210 , lidar data 222 from lidar system 220 , and, in some aspects, radar data 232 from radar system 230 .
- Data processing module 240 further includes road surface marking identification module 242 , neighboring vehicle identification module 244 , region of interest module 246 , and lane identifying module 248 .
- Road surface marking identification module 242 is configured to identify road markings in the vicinity of a vehicle from information contained in received sensor data.
- Information can include attributes, such as, for example, color intensity, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling. Other roadway marking information can also be used.
- Neighboring vehicle identification module 244 is configured to detect neighboring vehicles from information contained in received sensor data. For example, neighboring vehicle information module 244 can detect the location and trajectory of neighboring vehicles. Detected vehicles may be going in the same direction as a vehicle that includes image-capture device 210 , lidar system 220 , and possibly radar system 230 (e.g., in the same lane as the vehicle or in a lane adjacent to the vehicle). Detected vehicles may also be travelling toward the automated vehicle (e.g., in oncoming traffic).
- Region of interest module 246 is configured to identify lane marking data for determining the correct lane markings to follow. For example, region of interest module 246 can identify lane marking data 260 relative to a vehicle and exclude other road surface markings such as speed limit information, HOV markings, and the like. Region of interest module 246 can also identify neighboring vehicle data 262 related to the location and trajectory of neighboring vehicles in relevant proximity to a vehicle.
- Region of interest module 246 is configured to pass lane marking data 260 and neighboring vehicle data 262 to lane identifying module 248 .
- Lane identifying module 248 is configured to utilize the attributes of the lane marking data 260 to determine the relevant set of lane markings to follow.
- The attributes of the lane markings may include intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling. Attributes of other roadway markings can also be used.
- Lane identifying module 248 can utilize neighboring vehicle data 262 to determine the location and trajectory of neighboring vehicles. Once lane identifying module 248 has identified relevant lane markings to follow, lane identifying module can use the location and trajectory of neighboring vehicles to increase the confidence level that the relevant lane markings are in fact the lane markings to follow.
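The confidence increase from neighboring vehicles can be sketched as follows. The lane geometry, boost size, and cap below are illustrative assumptions: each same-direction vehicle tracking inside the candidate lane nudges confidence upward.

```python
def adjusted_confidence(base_conf, lane_center, lane_width, vehicle_offsets,
                        boost=0.05):
    """Raise confidence in the identified lane markings for each observed
    same-direction vehicle whose lateral offset falls inside the candidate
    lane. Boost size and geometry are illustrative assumptions."""
    conf = base_conf
    half_width = lane_width / 2.0
    for offset in vehicle_offsets:
        if abs(offset - lane_center) <= half_width:
            conf = min(1.0, conf + boost)
    return conf

# Candidate lane centered at 0 m, 3.6 m wide; two vehicles track inside it,
# while one vehicle (offset 5.0 m) is in another lane and adds no confidence.
print(adjusted_confidence(0.8, 0.0, 3.6, [0.2, -0.4, 5.0]))  # ~0.9
```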
- Control system module 250 is configured to utilize the relevant lane markings to control the vehicle's location and trajectory.
- FIG. 3 illustrates a flow chart of an example method 300 for distinguishing lane markings for a vehicle to follow. Method 300 will be described with respect to the components and data of computer architecture 200 .
- Method 300 includes accessing sensor data gathered by a plurality of sensors, the plurality of sensors including a first type of sensor and at least a second, different type of sensor, the sensor data indicating road-surface markings, including their intensity of color and reflectivity, the sensor data also indicating the location and trajectory of neighboring vehicles ( 301 ).
- For example, a vehicle can include computer system 201 , image-capture device 210 , lidar system 220 , and possibly also radar system 230 .
- Other sensors are possible, as well.
- Image-capture device 210 can capture image data 212 , lidar system 220 can capture lidar data 222 , and radar system 230 can capture radar data 232 .
- Data processing module 240 can access image data 212 , lidar data 222 , and, when appropriate, radar data 232 .
- Image data 212 , lidar data 222 , and, when appropriate, radar data 232 (hereinafter also referred to as the “accessed sensor data”) can indicate road surface markings and neighboring vehicles in the vicinity of the vehicle.
- The indication of road surface markings can include data for determining color intensity and reflectivity for the road surface markings.
- The indication of neighboring vehicles can include data for determining location and trajectory.
- Road surface marking module 242 can utilize the accessed sensor data to identify road surface markings on the road. For example, road surface marking module 242 can identify painted lane markings, turn lane markings, stop sign markings, speed limit markings, and railroad crossing markings, just to name a few.
- Neighboring vehicle identification module 244 can utilize the accessed sensor data to identify neighboring vehicles. For example, neighboring vehicle identification module 244 can identify the location and trajectory of neighboring vehicles on the road with the vehicle.
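As an illustrative sketch (not taken from the patent), a neighboring vehicle's trajectory could be estimated from successive sensed positions:

```python
def trajectory(positions, dt):
    """Estimate a neighboring vehicle's velocity vector from its two most
    recent sensed positions (x, y), separated by dt seconds."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt)

# A vehicle sensed 15 m further along the road axis after 1 s; coordinates
# and frame are illustrative.
print(trajectory([(0.0, 0.0), (0.0, 15.0)], 1.0))  # (0.0, 15.0)
```

The sign of the along-road component then distinguishes same-direction traffic from oncoming traffic.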
- Method 300 includes determining bounding boxes from the accessed sensor data, the bounding boxes representing regions of interest for distinguishing road-surface markings and neighboring vehicles ( 302 ).
- For example, region of interest module 246 can determine bounding boxes from image data 212 , lidar data 222 , and, when appropriate, radar data 232 .
- Image data 212 depicts an image of the road and the associated lane markings and vehicles on the road at the time of the image capture.
- Region of interest module 246 can isolate the lane marking data and neighboring vehicle data necessary for identifying the lane for the vehicle to follow.
- For example, road markings in image data 212 can include two solid yellow lines, two faded yellow lines, a solid dashed white line, a faded dashed white line, a solid white line, and a faded white line.
- The faded lines are not as bright or as prominent as the other lines.
- Region of interest module 246 can utilize bounding box algorithms to determine the bounding box of the solid yellow lines, the bounding box of the faded yellow lines, the bounding box of the solid dashed white line, the bounding box of the faded dashed white line, the bounding box of the solid white line, and the bounding box of the faded white line.
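A minimal sketch of the axis-aligned bounding box computation such an algorithm might perform (names and coordinates are illustrative):

```python
def bounding_box(points):
    """Axis-aligned bounding box (x_min, y_min, x_max, y_max) enclosing the
    pixel coordinates of one detected region of interest."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Illustrative pixels belonging to a detected dashed white line
line_pixels = [(120, 40), (122, 80), (121, 60), (123, 100)]
print(bounding_box(line_pixels))  # (120, 40, 123, 100)
```

Only the sensor data inside each box then needs to be passed downstream, which limits the data the comparison algorithms must process.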
- Region of interest module 246 can also isolate neighboring vehicle data from the accessed sensor data utilizing bounding boxes.
- The accessed sensor data can indicate other vehicles on the road.
- For example, the accessed sensor data may depict a vehicle in the same lane, a vehicle in the adjacent lane, an oncoming vehicle in the adjacent lane, and an oncoming vehicle in the far lane.
- Region of interest module 246 can utilize bounding box algorithms to determine any of: a bounding box of a vehicle in the same lane, a bounding box of a vehicle in the adjacent lane, a bounding box of the oncoming vehicle in an adjacent lane, and a bounding box of a vehicle in an oncoming far lane.
- Method 300 includes processing the accessed sensor data within the bounding boxes to determine that multiple road-surface markings are present ( 303 ).
- For example, data processing module 240 can process the accessed sensor data contained within the bounding boxes to determine that multiple lane markings exist in the lane of the road in which the vehicle is traveling.
- The multiple lane markings can include two solid yellow lines and two faded yellow lines.
- Data processing module 240 can recognize the two sets of road-surface markings as ambiguous and/or conflicting and determine that multiple road-surface markings are present.
- Data processing module 240 can represent the two sets of road-surface markings in lane marking data 260.
- Data processing module 240 can also represent identified neighboring vehicles in neighboring vehicle data 262.
- Neighboring vehicle data 262 can identify the location and trajectory of neighboring vehicles.
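As a rough illustration of what lane marking data 260 and neighboring vehicle data 262 might contain, consider the records below. Every field name, type, and value here is a hypothetical assumption for illustration; the patent does not specify a data model.

```python
# Hypothetical record shapes for lane marking data 260 and neighboring
# vehicle data 262. All fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MarkingSet:
    color: str           # e.g., "yellow" or "white"
    style: str           # e.g., "solid" or "dashed"
    faded: bool          # True for old, masked, or worn markings
    bounding_box: tuple  # (x_min, y_min, x_max, y_max) region of interest

@dataclass
class NeighborVehicle:
    position: tuple      # (x, y) relative to our vehicle, in meters
    heading_deg: float   # trajectory direction; 0 = same direction as us
    lane: str            # "same", "adjacent", or "oncoming"

lane_marking_data = [
    MarkingSet("yellow", "solid", False, (0, 0, 10, 400)),
    MarkingSet("yellow", "solid", True, (40, 0, 50, 400)),
]
neighboring_vehicle_data = [
    NeighborVehicle((3.5, 20.0), 1.0, "adjacent"),
]
```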
- Method 300 includes utilizing the sensor data to identify road-surface markings, from among the multiple road-surface markings, that the vehicle is to follow (304).
- Lane identifying module 248 can utilize lane marking data 260 to identify the road-surface markings, from among the multiple road-surface markings, that the vehicle is to follow.
- Lane identifying module 248 can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of the two solid yellow lines and the two faded yellow lines, and likewise of the solid dashed white line and the faded dashed white line. Other attributes can be compared, as well.
- The lane identifying module 248 can identify the two solid yellow lines and the solid dashed white line as the relevant lane markings for the lane that the vehicle is to follow.
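The attribute comparison can be sketched as a simple scoring function over the listed attributes (color intensity, reflectivity, lane reflectors, paint cracking, tape peeling). The 0-1 scales, equal weights, and sample values below are assumptions; the patent does not prescribe a scoring formula.

```python
# Minimal sketch: score each candidate marking set on the attributes the
# description lists, then keep the higher-scoring (presumably newer) set.
# Scales and weights are illustrative assumptions.

def marking_score(attrs):
    """Higher score = more likely the current, relevant markings."""
    return (attrs["color_intensity"]                    # 0-1, brighter is higher
            + attrs["reflectivity"]                     # 0-1
            + (1.0 if attrs["has_reflectors"] else 0.0)
            - attrs["paint_cracking"]                   # 0-1, worse is higher
            - attrs["tape_peeling"])                    # 0-1, worse is higher

def pick_relevant(set_a, set_b):
    """Return whichever candidate set scores at least as high."""
    if marking_score(set_a["attrs"]) >= marking_score(set_b["attrs"]):
        return set_a
    return set_b

solid_yellow = {"name": "solid yellow pair",
                "attrs": {"color_intensity": 0.9, "reflectivity": 0.8,
                          "has_reflectors": True, "paint_cracking": 0.1,
                          "tape_peeling": 0.0}}
faded_yellow = {"name": "faded yellow pair",
                "attrs": {"color_intensity": 0.3, "reflectivity": 0.2,
                          "has_reflectors": False, "paint_cracking": 0.7,
                          "tape_peeling": 0.5}}
```

With these sample attribute values, the solid yellow pair scores higher and would be selected as the markings to follow.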
- Method 300 includes using the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings (305).
- Lane identifying module 248 can utilize neighboring vehicle data 262 to identify the location and trajectory of neighboring vehicles to increase confidence with respect to identification of the two solid yellow lines and the solid dashed white line as the relevant lane markings.
- Lane identifying module 248 can determine that a neighboring vehicle is moving in the same direction and in the same lane as the vehicle.
- Lane identifying module 248 can determine that a neighboring vehicle is moving in the same direction as the vehicle in an adjacent lane.
- Lane identifying module 248 can identify other neighboring vehicles going in the opposite direction in lanes adjacent to the vehicle. Given the location and trajectory information of the neighboring vehicles, the lane identifying module 248 has increased confidence that the two solid yellow lines and the solid dashed white line are relevant.
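One way to realize this confidence increase is to treat each neighboring vehicle whose location and trajectory are consistent with the hypothesized lane geometry as supporting evidence. The consistency thresholds and confidence increments below are illustrative assumptions, not the patent's method.

```python
# Sketch: raise confidence in the identified lane markings when neighboring
# vehicle trajectories agree with the lane geometry those markings imply.
# Thresholds and the 0.05 increment are illustrative assumptions.

def consistent(vehicle, lane_half_width=1.8):
    """A neighbor supports the hypothesis if it travels parallel to the lane
    (same or opposite direction) and, when in our lane, stays inside the
    hypothesized bounds (lateral offset near zero)."""
    same_dir = abs(vehicle["heading_deg"]) < 10.0
    opposite = abs(abs(vehicle["heading_deg"]) - 180.0) < 10.0
    in_bounds = abs(vehicle["lateral_m"]) < lane_half_width
    if vehicle["lane"] == "same":
        return same_dir and in_bounds
    return same_dir or opposite

def boosted_confidence(base, neighbors, step=0.05, cap=1.0):
    """Add a small increment per supporting neighbor, capped at 1.0."""
    for v in neighbors:
        if consistent(v):
            base = min(cap, base + step)
    return round(base, 2)

neighbors = [
    {"lane": "same", "heading_deg": 2.0, "lateral_m": 0.3},       # ahead, same lane
    {"lane": "adjacent", "heading_deg": -3.0, "lateral_m": 3.6},  # same direction
    {"lane": "oncoming", "heading_deg": 179.0, "lateral_m": -3.5},
]
```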
- Method 300 includes sending the identified road-surface markings to the vehicle's decision making algorithms for use in controlling the vehicle's location and trajectory (306).
- Data processing module 240 can send the two solid yellow lines and the solid dashed white line to control system module 250 for use in controlling the vehicle's location and trajectory.
- FIG. 4 illustrates an example roadway scenario 400.
- Road 402 includes lanes 403, 404, 405, and 406.
- Vehicle 450 is traveling in lane 403 on road 402.
- Vehicle 450 includes a computer system (not shown) that is configured to control the vehicle 450 in an autonomous mode.
- The computer system is similar to computer system 201.
- The computer system may use lane information, such as lane markings, to estimate the bounds of lane 403.
- Vehicle 450 includes image-capture device 452, lidar unit 454, and/or radar unit 456.
- Other sensors are possible, as well.
- Vehicle 450 can use the plurality of sensors to obtain information about lane 403.
- The computer system can use image-capture device 452 and lidar unit 454 to sense lane markings of lane 403.
- Radar unit 456 may also be used to gather information about the environment of vehicle 450.
- The computer system can be configured to identify regions of interest from the accessed sensor data utilizing bounding boxes.
- The accessed data from image-capture device 452 may depict an image of the road 402.
- The image can include associated lane markings and vehicles on the road 402 at the time of the image capture.
- The computer system can isolate the lane marking data and neighboring vehicle data necessary for identifying the bounds of lane 403 for vehicle 450 to follow.
- The computer system can isolate road marking data from the accessed sensor data utilizing bounding boxes.
- The road markings in the accessed sensor data may include two solid yellow lines 410, two faded yellow lines 412, a solid dashed white line 420, a faded dashed white line 422, a solid white line 430, and a faded white line 432.
- The computer system can utilize bounding box algorithms to determine: a bounding box 411 of the solid yellow lines 410, a bounding box 413 of the faded yellow lines 412, a bounding box 421 of the solid dashed white line 420, a bounding box 423 of the faded dashed white line 422, a bounding box 431 of the solid white line 430, and a bounding box 433 of the faded white line 432.
- The computer system can also isolate neighboring vehicle data from the accessed sensor data utilizing bounding boxes.
- The accessed sensor data may depict other vehicles on the road.
- The accessed sensor data can indicate vehicle 460 in lane 403, vehicle 462 in lane 404, an oncoming vehicle 464 in lane 405, and an oncoming vehicle 466 in lane 406.
- The computer system can utilize bounding box algorithms to determine: a bounding box 461 for vehicle 460, a bounding box 463 for vehicle 462, a bounding box 465 for oncoming vehicle 464, and a bounding box 467 for vehicle 466.
- The computer system can process the accessed sensor data contained within the bounding boxes to determine that multiple lane markings exist in lane 403.
- The road markings in the accessed sensor data within the bounding boxes may depict both two solid yellow lines 410 and two faded yellow lines 412.
- The computer system can recognize the two sets of road-surface markings as ambiguous and/or conflicting sets of lane markings. As such, the computer system determines that multiple road-surface markings are present.
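The multiple-markings determination can be sketched as a proximity test between bounding boxes of same-colored marking sets: two nearby parallel sets (e.g., boxes 411 and 413) suggest an old/new conflict, while well-separated markings do not. The pixel threshold and box coordinates below are illustrative assumptions.

```python
# Sketch of the ambiguity check: two same-colored marking sets whose
# bounding boxes lie close together laterally are flagged as a
# conflicting (old vs. new) pair. The 60-pixel threshold is an assumption.

def horizontal_gap(box_a, box_b):
    """Lateral gap between two (x_min, y_min, x_max, y_max) boxes."""
    if box_a[0] > box_b[0]:
        box_a, box_b = box_b, box_a
    return max(0, box_b[0] - box_a[2])

def ambiguous_pair(box_a, box_b, max_gap_px=60):
    """Two nearby parallel marking sets suggest old + new lane lines."""
    return horizontal_gap(box_a, box_b) <= max_gap_px

solid_yellow_box = (100, 0, 110, 400)  # e.g., bounding box 411
faded_yellow_box = (140, 0, 150, 400)  # e.g., bounding box 413
far_white_box = (600, 0, 610, 400)     # e.g., bounding box 431
```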
- The computer system can utilize the accessed sensor data to identify relevant road-surface markings that vehicle 450 is to follow. For example, the computer system can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of the two solid yellow lines 410 and the two faded yellow lines 412. Similarly, the computer system can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of the solid dashed white line 420 and the faded dashed white line 422. Other factors may also be used. Based on the accessed sensor data, the computer system can identify the two solid yellow lines 410 and the solid dashed white line 420 as the relevant (and correct) lane markings bounding lane 403.
- The computer system is able to use the location and trajectory of neighboring vehicles to increase confidence with respect to identification of the two solid yellow lines 410 and the solid dashed white line 420.
- The accessed sensor data can indicate the location and trajectory of vehicle 460.
- The computer system can determine that vehicle 460 is moving in lane 403 in the same direction as vehicle 450.
- The accessed sensor data can indicate the location and trajectory of vehicle 462.
- The computer system can determine that vehicle 462 is moving in lane 404 in the same direction as vehicle 450.
- The computer system can identify vehicle 464 and vehicle 466 going in the opposite direction of vehicle 450 in lanes 405 and 406, respectively. Given the location and trajectory information of the neighboring vehicles, the computer system has increased confidence that the two solid yellow lines 410 and the solid dashed white line 420 were appropriately identified.
- The computer system can pass the identified correct lane markings to vehicle 450's decision making algorithms for use in controlling the vehicle's location and trajectory.
Abstract
The present invention extends to methods, systems, and computer program products for distinguishing lane markings for a vehicle to follow. Automated driving or driving assist vehicles utilize sensors to help the vehicle navigate on roadways or in parking areas. The sensors can utilize, for example, the painted surface markings to help guide the vehicle on its path. Aspects of the invention use a first type of sensor and at least a second different type of sensor to identify road surface markings. When ambiguity is detected between road surface markings, decision making algorithms identify the correct set of markings for a vehicle to abide by. The sensors also identify the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings.
Description
- 1. Field of the Invention
- This invention relates generally to the field of road line detection systems, and, more particularly, to road line detection systems which can distinguish lane markings for a vehicle to follow.
- 2. Related Art
- Many industries, including the automotive industry, are utilizing advanced automation capabilities. The opportunity exists to automate many of the functions of a traditional vehicle. Some research is being performed in the automotive industry to make a vehicle completely autonomous, requiring no human input at all.
- Several benefits exist as the result of vehicles becoming more automated or completely autonomous. Benefits include reduced traffic accidents, improved traffic flow, and reduced fuel consumption, just to name a few. For example, many vehicles are utilizing collision avoidance technology where the vehicle's computer system can apply the brakes if the vehicle's sensors detect that the vehicle is in danger of colliding with another vehicle or object. Also, many vehicles have introduced a parallel parking feature which allows the vehicle to parallel park without any input from the driver of the vehicle.
- The plausibility of automated or autonomous vehicles is heavily dependent on the ability of the vehicle sensors and the vehicle computer system to identify and process the vehicle's environment and to react to different situations in a safe and efficient manner. However, the vehicle environment is very dynamic. Thus, the vehicle computer system needs to be able to perform in varying weather conditions, such as, rain or snow, and react to various road conditions, such as, icy roads or roads that are under construction. The challenge exists to guarantee a safe and efficient driving experience, regardless of the vehicle environment.
- Some vehicle automation techniques used for vehicle navigation consider the painted lane markings on the surface of a road. The lane markings can be used as a means to safely guide the vehicle along its route. However, at least a few difficulties arise from the use of lane markings. For example, road surface markings not only include lane markings, but they also include navigation information, such as, turning lane indicators, speed limit information, crosswalk information, railroad information, and high-occupancy vehicle (HOV) markings. Distinguishing between lane markings and other road surface markings can be difficult.
- Another challenge is distinguishing between ambiguous lane markers that can occur, for example, during road construction, maintenance, or upgrades. Lanes can be shifted to one side or another during periods of construction to facilitate the maintenance of roads. During the period of construction, road maintenance, or upgrades, new lane markings are painted onto the road and old lane markings can either be scraped away or painted in the color of the road in an effort to mask the old lane markings. However, masking attempts are not always fully successful and old lane markings can sometimes be seen after scraping or re-painting. As such, conditions may exist where two sets of lane markings can be seen for a particular lane of the road. Determining which set of lane markings to follow in an automated manner can be difficult.
- The specific features, aspects and advantages of the present invention will become better understood with regard to the following description and accompanying drawings where:
- FIG. 1 illustrates an example block diagram of a computing device.
- FIG. 2 illustrates an example computer architecture that facilitates distinguishing lane markings for a vehicle to follow.
- FIG. 3 illustrates a flow chart of an example method for distinguishing lane markings for a vehicle to follow.
- FIG. 4 illustrates an example roadway scenario.
- The present invention extends to methods, systems, and computer program products for distinguishing lane markings for a vehicle to follow.
- Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
- Computer storage media (devices) includes RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. RAM can also include solid state drives (SSDs or PCIx based real time memory tiered storage, such as FusionIO). Thus, it should be understood that computer storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
- Embodiments of the invention can also be implemented in cloud computing environments. In this description and the following claims, “cloud computing” is defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly. A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS)), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.). Databases and servers described with respect to the present invention can be included in a cloud model.
- Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the following description and Claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
- In general, aspects of the invention are directed to distinguishing lane markings for a vehicle to follow. At least two different types of sensory devices gather data related to the vehicle's environment. The data is processed to identify road surface markings and neighboring vehicles. Important regions of interest are extracted from the processed data. Lane marking data and neighboring vehicle data are utilized to identify the correct set of lane markings for a vehicle to follow. The correct lane markings are utilized by the vehicle computer system and control system to navigate the vehicle in the correct lane of the road.
- FIG. 1 illustrates an example block diagram of a computing device 100. Computing device 100 can be used to perform various procedures, such as those discussed herein. Computing device 100 can function as a server, a client, or any other computing entity. Computing device 100 can perform various communication and data transfer functions as described herein and can execute one or more application programs, such as the application programs described herein. Computing device 100 can be any of a wide variety of computing devices, such as a mobile telephone or other mobile device, a desktop computer, a notebook computer, a server computer, a handheld computer, a tablet computer, and the like.
- Computing device 100 includes one or more processor(s) 102, one or more memory device(s) 104, one or more interface(s) 106, one or more mass storage device(s) 108, one or more Input/Output (I/O) device(s) 110, and a display device 130, all of which are coupled to a bus 112. Processor(s) 102 include one or more processors or controllers that execute instructions stored in memory device(s) 104 and/or mass storage device(s) 108. Processor(s) 102 may also include various types of computer storage media, such as cache memory.
- Memory device(s) 104 include various computer storage media, such as volatile memory (e.g., random access memory (RAM) 114) and/or nonvolatile memory (e.g., read-only memory (ROM) 116). Memory device(s) 104 may also include rewritable ROM, such as Flash memory.
- Mass storage device(s) 108 include various computer storage media, such as magnetic tapes, magnetic disks, optical disks, solid state memory (e.g., Flash memory), and so forth. As depicted in FIG. 1, a particular mass storage device is a hard disk drive 124. Various drives may also be included in mass storage device(s) 108 to enable reading from and/or writing to the various computer readable media. Mass storage device(s) 108 include removable media 126 and/or non-removable media.
- I/O device(s) 110 include various devices that allow data and/or other information to be input to or retrieved from computing device 100. Example I/O device(s) 110 include cursor control devices, keyboards, keypads, barcode scanners, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, cameras, lenses, CCDs or other image capture devices, and the like.
- Display device 130 includes any type of device capable of displaying information to one or more users of computing device 100. Examples of display device 130 include a monitor, display terminal, video projection device, and the like.
- Interface(s) 106 include various interfaces that allow computing device 100 to interact with other systems, devices, or computing environments as well as humans. Example interface(s) 106 can include any number of different network interfaces 120, such as interfaces to personal area networks (PANs), local area networks (LANs), wide area networks (WANs), wireless networks (e.g., near field communication (NFC), Bluetooth, Wi-Fi, etc., networks), and the Internet. Other interfaces include user interface 118 and peripheral device interface 122.
- Bus 112 allows processor(s) 102, memory device(s) 104, interface(s) 106, mass storage device(s) 108, and I/O device(s) 110 to communicate with one another, as well as other devices or components coupled to bus 112. Bus 112 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
- Aspects of the invention can be used to identify and compare road surface markings in order to correctly abide by relevant (e.g., newer) road instructions. Camera, lidar (light detection and ranging), and other range sensors can be used to capture information of the road and surrounding vehicles. Computer vision and sensor fusion algorithms employing neural networks can be trained to recognize surrounding vehicles and conflicting road surface marking regions of interest on roadways and parking lots. When multiple sets of markings are detected, factors including intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling can be used to compare surface marking sets and determine relevant (e.g., newer) markings to follow. Other roadway marking information can also be used. The behavior of other vehicles can also be used to add certainty when determining a set of markings to follow.
- Identifying and comparing road surface markings can include: (1) Using neural networks to identify surface markings on the road and other vehicles within sensor viewing ranges, (2) Determining bounding boxes for identified regions of interest, (3) Sending image data within bounding boxes to algorithms configured to determine if there are multiple sets of road surface markings, and if so, compare the sets to determine the relevant set of markings to follow, and (4) Sending the relevant road surface markings to the vehicle decision making algorithms.
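The four numbered steps can be sketched end to end. Detection (step 1) is stubbed out with precomputed results, since the trained networks are left unspecified here; the flow from detections through disambiguation to the decision-making hand-off is what the sketch illustrates, and all names and values are assumptions.

```python
# Illustrative end-to-end sketch of the four-step flow. Real detection
# would run trained networks over camera/lidar data; here detections are
# precomputed stand-ins with an assumed relevance score per marking set.

def run_pipeline(detections, control_system):
    # Steps (1)-(2): detections carry a kind and a bounding box.
    markings = [d for d in detections if d["kind"] == "marking"]
    # Step (3): with multiple candidate sets, keep the highest-scoring set.
    relevant = max(markings, key=lambda d: d["score"])
    # Step (4): hand the relevant markings to the decision making algorithms.
    control_system(relevant)
    return relevant

chosen = []
detections = [
    {"kind": "marking", "box": (100, 0, 110, 400), "score": 2.6},   # solid pair
    {"kind": "marking", "box": (140, 0, 150, 400), "score": -0.7},  # faded pair
    {"kind": "vehicle", "box": (300, 50, 360, 120), "score": 0.9},
]
result = run_pipeline(detections, chosen.append)
```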
- FIG. 2 illustrates an example computer architecture 200 that facilitates distinguishing lane markings for a vehicle to follow. Computer architecture 200 can be contained within a vehicle, such as, for example, a car, a truck, a bus, or a motorcycle. Referring to FIG. 2, computer architecture 200 includes computer system 201, image-capture device 210, and lidar system 220. In some embodiments, radar system 230 is also included. Each of computer system 201, image-capture device 210, lidar system 220, and radar system 230, as well as their respective components, can be connected to one another over (or be part of) a network, such as, for example, a PAN, a LAN, a WAN, and even the Internet. Accordingly, each of computer system 201, image-capture device 210, lidar system 220, and radar system 230, as well as any other connected computer systems and their components, can create message related data and exchange message related data (e.g., near field communication (NFC) payloads, Bluetooth packets, Internet Protocol (IP) datagrams and other higher layer protocols that utilize IP datagrams, such as, Transmission Control Protocol (TCP), Hypertext Transfer Protocol (HTTP), Simple Mail Transfer Protocol (SMTP), etc.) over the network.
- In one aspect, each of image-capture device 210, lidar system 220, and possibly radar system 230 are included in a vehicle. Each of the sensing devices is configured to sense data in the vicinity of the vehicle. For example, image-capture device 210 is configured to capture image data 212, lidar system 220 is configured to capture lidar data 222, and radar system 230 is configured to capture radar data 232.
- As depicted, computer system 201 includes data processing module 240 and control system module 250. Data processing module 240 is configured to receive data from the sensing devices. For example, data processing module 240 can receive image data 212 from image-capture device 210, lidar data 222 from lidar system 220, and in some aspects, radar data 232 from radar system 230.
- Data processing module 240 further includes road surface marking identification module 242, neighboring vehicle identification module 244, region of interest module 246, and lane identifying module 248.
- Road surface marking identification module 242 is configured to identify road markings in the vicinity of a vehicle from information contained in received sensor data. Information can include attributes, such as, for example, color intensity, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling. Other roadway marking information can also be used.
- Neighboring vehicle identification module 244 is configured to detect neighboring vehicles from information contained in received sensor data. For example, neighboring vehicle identification module 244 can detect the location and trajectory of neighboring vehicles. Detected vehicles may be going in the same direction as a vehicle that includes image-capture device 210, lidar system 220, and possibly radar system 230 (e.g., in the same lane as the vehicle or in a lane adjacent to the vehicle). Detected vehicles may also be travelling toward the automated vehicle (e.g., in oncoming traffic).
- Region of interest module 246 is configured to identify lane marking data for determining the correct lane markings to follow. For example, region of interest module 246 can identify lane marking data 260 relative to a vehicle and exclude other road surface markings such as speed limit information, HOV markings, and the like. Region of interest module 246 can also identify neighboring vehicle data 262 related to the location and trajectory of neighboring vehicles in relevant proximity to a vehicle.
- Region of interest module 246 is configured to pass lane marking data 260 and neighboring vehicle data 262 to lane identifying module 248. Lane identifying module 248 is configured to utilize the attributes of the lane marking data 260 to determine the relevant set of lane markings to follow. For example, the attributes of the lane markings may include intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling. Attributes of other roadway markings can also be used. Lane identifying module 248 can utilize neighboring vehicle data 262 to determine the location and trajectory of neighboring vehicles. Once lane identifying module 248 has identified relevant lane markings to follow, lane identifying module 248 can use the location and trajectory of neighboring vehicles to increase the confidence level that the relevant lane markings are in fact the lane markings to follow.
- Control system module 250 is configured to utilize the relevant lane markings to control the vehicle's location and trajectory.
- FIG. 3 illustrates a flow chart of an example method 300 for distinguishing lane markings for a vehicle to follow. Method 300 will be described with respect to the components and data of computer architecture 200.
- Method 300 includes accessing sensor data which has been gathered by a plurality of sensors, the plurality of sensors including a first type of sensor and at least a second different type of sensor, the sensor data indicating road-surface markings, the road-surface markings including intensity of color and reflectivity, the sensor data also indicating the location and trajectory of neighboring vehicles (301). For example, a vehicle can include a computer system 201, image-capture device 210, lidar system 220, and possibly also radar system 230. Other sensors are possible, as well. Image-capture device 210 can capture image data 212, lidar system 220 can capture lidar data 222, and, when included, radar system 230 can capture radar data 232. Data processing module 240 can access image data 212, lidar data 222, and, when appropriate, radar data 232. Collectively, image data 212, lidar data 222, and, when appropriate, radar data 232 (hereinafter also referred to as the “accessed sensor data”) can indicate road surface markings and neighboring vehicles in the vicinity of the vehicle. The indication of road surface markings can include data for determining color intensity and reflectivity for the road surface markings. The indication of neighboring vehicles can include data for determining location and trajectory.
- Road surface marking module 242 can utilize the accessed sensor data to identify road surface markings on the road. For example, road surface marking module 242 can identify painted lane markings, turn lane markings, stop sign markings, speed limit markings, and railroad crossing markings, just to name a few.
- Neighboring vehicle identification module 244 can utilize the accessed sensor data to identify neighboring vehicles. For example, neighboring vehicle identification module 244 can identify the location and trajectory of neighboring vehicles on the road with the vehicle.
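Trajectory identification can be sketched from successive sensed positions of a neighboring vehicle. The ego-centered coordinate frame, the 0.5 s sampling interval, and the finite-difference estimate below are assumptions for illustration, not the patent's method.

```python
# Sketch: estimate a neighboring vehicle's trajectory (velocity vector and
# speed) from two successive sensed positions in an ego-centered (x, y)
# frame, where +y is our direction of travel. dt is an assumed interval.

def estimate_trajectory(prev_pos, curr_pos, dt=0.5):
    """Return ((vx, vy) in m/s, speed in m/s) from two (x, y) fixes."""
    vx = (curr_pos[0] - prev_pos[0]) / dt
    vy = (curr_pos[1] - prev_pos[1]) / dt
    speed = (vx * vx + vy * vy) ** 0.5
    return (vx, vy), speed

def same_direction(velocity):
    """Forward (+y) motion in the ego frame means same direction of travel."""
    return velocity[1] > 0

# A neighbor 20 m ahead in the adjacent lane, moving forward with us.
(v, speed) = estimate_trajectory((3.5, 20.0), (3.5, 22.5))
```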
Method 300 includes determining bounding boxes from the accessed sensor data, the bounding boxes representing regions of interest for distinguishing road-surface markings and neighboring vehicles (302). For example, region of interest module 246 can determine bounding boxes from image data 212, lidar data 222, and, when appropriate, radar data 232. In one aspect, image data 212 depicts an image of the road and the associated lane markings and vehicles on the road at the time of the image capture. Region of interest module 246 can isolate the lane marking data and neighboring vehicle data necessary for identifying the lane for the vehicle to follow.
- For example, road markings in image data 212 can include two solid yellow lines, two faded yellow lines, a solid dashed white line, a faded dashed white line, a solid white line, and a faded white line. The faded lines are not as bright or as prominent as the other lines described herein. Region of interest module 246 can utilize bounding box algorithms to determine the bounding box of the solid yellow lines, the bounding box of the faded yellow lines, the bounding box of the solid dashed white line, the bounding box of the faded dashed white line, the bounding box of the solid white line, and the bounding box of the faded white line.
- Region of interest module 246 can also isolate neighboring vehicle data from the accessed sensor data utilizing bounding boxes. The accessed sensor data can indicate other vehicles on the road. For example, the accessed sensor data may depict a vehicle in the same lane, a vehicle in the adjacent lane, an oncoming vehicle in the adjacent lane, and an oncoming vehicle in the far lane. Region of interest module 246 can utilize bounding box algorithms to determine any of: a bounding box of the vehicle in the same lane, a bounding box of the vehicle in the adjacent lane, a bounding box of the oncoming vehicle in the adjacent lane, and a bounding box of the oncoming vehicle in the far lane.
-
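A minimal example of the bounding-box step: given image coordinates classified as belonging to one marking, compute the axis-aligned box that encloses them. The specification does not name a particular algorithm, so this is a generic stand-in.

```python
def bounding_box(points):
    """Axis-aligned bounding box of a set of (x, y) detections.

    Returns (min_x, min_y, max_x, max_y) — a simple stand-in for the
    'bounding box algorithms' applied to each region of interest.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Pixels classified as belonging to a solid yellow line, for instance:
solid_yellow = [(12, 40), (13, 80), (11, 120), (14, 160)]
assert bounding_box(solid_yellow) == (11, 40, 14, 160)
```

The same routine can be applied per detected object to produce the per-vehicle boxes described above.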
Method 300 includes processing the accessed sensor data within the bounding boxes to determine that multiple road-surface markings are present (303). For example, data processing module 240 can process the accessed sensor data contained within the bounding boxes to determine that multiple lane markings exist in the lane of the road that the vehicle is traveling in. The multiple lane markings can include two solid yellow lines and two faded yellow lines. Data processing module 240 can recognize the two sets of road-surface markings as ambiguous and/or conflicting and determine that multiple road-surface markings are present. Data processing module 240 can represent the two sets of road-surface markings in lane marking data 260.
-
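The ambiguity determination of step 303 amounts to noticing that more than one candidate marking set claims the same lane boundary, as after re-striping. A hedged sketch, with a hypothetical record layout (the `role`/`style`/`faded` fields are illustrative, not from the specification):

```python
def detect_ambiguity(markings):
    """Group candidate markings by the lane boundary they claim and flag
    any boundary claimed by more than one set as ambiguous/conflicting."""
    by_role = {}
    for m in markings:
        by_role.setdefault(m["role"], []).append(m)
    return {role: group for role, group in by_role.items() if len(group) > 1}

markings = [
    {"role": "left",  "style": "solid yellow", "faded": False},
    {"role": "left",  "style": "solid yellow", "faded": True},   # old, faded paint
    {"role": "right", "style": "dashed white", "faded": False},
]
conflicts = detect_ambiguity(markings)
assert list(conflicts) == ["left"] and len(conflicts["left"]) == 2
```

A non-empty result corresponds to the "multiple road-surface markings are present" condition that triggers steps 304-306.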
Data processing module 240 can also represent neighboring vehicles identified in neighboring vehicle data 262. For example, neighboring vehicle data 262 can identify the location and trajectory of neighboring vehicles.
- In response to determining that multiple road-surface markings are present, method 300 includes utilizing the sensor data to identify road-surface markings, from among the multiple road-surface markings, the vehicle is to follow (304). For example, lane identifying module 248 can utilize lane marking data 260 to identify the road-surface markings, from among the multiple road-surface markings, the vehicle is to follow. Lane identifying module 248 can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of both the two solid yellow lines and the two faded yellow lines, and of the solid dashed white line and the faded dashed white line. Other attributes can be compared, as well. Given the accessed sensor data, lane identifying module 248 can identify the two solid yellow lines and the solid dashed white line as the relevant lane markings for the lane the vehicle is to follow.
- In response to determining that multiple road-surface markings are present, method 300 includes using the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings (305). For example, lane identifying module 248 can utilize neighboring vehicle data 262 to identify the location and trajectory of neighboring vehicles to increase confidence with respect to identification of the two solid yellow lines and the solid dashed white line as the relevant lane markings. Lane identifying module 248 can determine that a neighboring vehicle is moving in the same direction and in the same lane as the vehicle. Lane identifying module 248 can determine that another neighboring vehicle is moving in the same direction as the vehicle in an adjacent lane. Furthermore, lane identifying module 248 can identify other neighboring vehicles as vehicles going in the opposite direction in the adjacent lanes of the vehicle. Given the location and trajectory information of the neighboring vehicles, lane identifying module 248 has increased confidence that the two solid yellow lines and the solid dashed white line are relevant.
- In response to determining that multiple road-surface markings are present, method 300 includes sending the identified road-surface markings to the vehicle's decision making algorithms for use in controlling the vehicle's location and trajectory (306). For example, data processing module 240 can send the two solid yellow lines and the solid dashed white line to control system module 250 for use in controlling the vehicle's location and trajectory.
-
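Steps 304 and 305 can be sketched as a scoring problem: rank each candidate marking set by the compared attributes (color intensity, reflectivity, reflectors, paint cracking, tape peeling), then raise confidence for neighboring vehicles whose trajectories agree with the chosen lane. The weights, field names, and boost value here are illustrative assumptions, not disclosed by the specification.

```python
def marking_score(m):
    """Heuristic quality score combining the attributes the method compares.
    Weights are illustrative assumptions; fresher markings score higher."""
    score = 0.5 * m["color_intensity"] + 0.3 * m["reflectivity"]
    if m["has_reflectors"]:
        score += 0.2
    score -= 0.2 * m["paint_cracking"] + 0.2 * m["tape_peeling"]
    return score

def pick_markings(candidates, neighbor_headings, chosen_lane_heading):
    """Pick the highest-scoring candidate set (step 304), then raise confidence
    for each neighboring vehicle whose heading matches the lane direction the
    chosen markings imply (step 305)."""
    best = max(candidates, key=marking_score)
    confidence = marking_score(best)
    for heading in neighbor_headings:
        # Same-direction traffic consistent with the chosen lane supports the choice.
        if heading == chosen_lane_heading:
            confidence = min(1.0, confidence + 0.05)
    return best, confidence

fresh = {"color_intensity": 0.9, "reflectivity": 0.8, "has_reflectors": True,
         "paint_cracking": 0.1, "tape_peeling": 0.0}
faded = {"color_intensity": 0.3, "reflectivity": 0.2, "has_reflectors": False,
         "paint_cracking": 0.7, "tape_peeling": 0.5}
best, conf = pick_markings([fresh, faded], ["north", "north", "south"], "north")
assert best is fresh and conf > marking_score(fresh)
```

Step 306 then amounts to handing `best` to the control layer; what the decision-making algorithms do with it is outside this sketch.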
FIG. 4 illustrates an example roadway scenario 400. As depicted, road 402 includes lanes 403, 404, 405, and 406. Vehicle 450 is traveling in lane 403 on road 402. Vehicle 450 includes a computer system (not shown) that is configured to control the vehicle 450 in an autonomous mode. In one aspect, the computer system is similar to computer system 201. To this end, the computer system may use lane information, such as lane markings, to estimate the bounds of lane 403.
-
Vehicle 450 includes image-capture device 452, lidar unit 454, and/or radar unit 456. Other sensors are possible, as well. Vehicle 450 can use the plurality of sensors to obtain information about lane 403. For example, the computer system can use image-capture device 452 and lidar unit 454 to sense lane markings of lane 403. In some aspects, radar unit 456 may also be used to gather information about the environment of vehicle 450.
- The computer system can be configured to identify regions of interest from the accessed sensor data utilizing bounding boxes. For example, the accessed data from image-capture device 452 may depict an image of the road 402. The image can include associated lane markings and vehicles on the road 402 at the time of the image capture. The computer system can isolate the lane marking data and neighboring vehicle data necessary for identifying the bounds of lane 403 for vehicle 450 to follow.
- The computer system can isolate road marking data from the accessed sensor data utilizing bounding boxes. For example, the road markings in the accessed sensor data may include two solid yellow lines 410, two faded yellow lines 412, a solid dashed white line 420, a faded dashed white line 422, a solid white line 430, and a faded white line 432. The computer system can utilize bounding box algorithms to determine: a bounding box 411 of the solid yellow lines 410, a bounding box 413 of the faded yellow lines 412, a bounding box 421 of the solid dashed white line 420, a bounding box 423 of the faded dashed white line 422, a bounding box 431 of the solid white line 430, and a bounding box 433 of the faded white line 432.
- The computer system can also isolate neighboring vehicle data from the accessed sensor data utilizing bounding boxes. The accessed sensor data may depict other vehicles on the road. For example, the accessed sensor data can indicate vehicle 460 in
lane 403, vehicle 462 in lane 404, an oncoming vehicle 464 in lane 405, and an oncoming vehicle 466 in lane 406. The computer system can utilize bounding box algorithms to determine: a bounding box 461 for vehicle 460, a bounding box 463 for vehicle 462, a bounding box 465 for oncoming vehicle 464, and a bounding box 467 for vehicle 466.
- The computer system can process the accessed sensor data contained within the bounding boxes to determine that multiple lane markings exist in lane 403. For example, the road markings in the accessed sensor data within the bounding boxes may depict both two solid yellow lines 410 and two faded yellow lines 412. The computer system can recognize the two sets of road-surface markings as ambiguous and/or conflicting sets of lane markings. As such, the computer system determines that multiple road-surface markings are present.
- In response to determining that multiple road-surface markings are present, the computer system can utilize the accessed sensor data to identify relevant road-surface markings that
vehicle 450 is to follow. For example, the computer system can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of the two solid yellow lines 410 and the two faded yellow lines 412. Similarly, the computer system can utilize the accessed sensor data to compare the intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling of the solid dashed white line 420 and the faded dashed white line 422. Other factors may also be used. Based on the accessed sensor data, the computer system can identify the two solid yellow lines 410 and the solid dashed white line 420 as the relevant (and correct) lane markings bounding lane 403.
- Additionally, the computer system is able to use the location and trajectory of neighboring vehicles to increase confidence with respect to identification of the two solid yellow lines 410 and the solid dashed white line 420. For example, the accessed sensor data can indicate the location and trajectory of vehicle 460. The computer system can determine that vehicle 460 is moving in lane 403 in the same direction as vehicle 450. The accessed sensor data can indicate the location and trajectory of vehicle 462. The computer system can determine that vehicle 462 is moving in lane 404 in the same direction as vehicle 450. Furthermore, the computer system can identify vehicle 464 and vehicle 466 going in the opposite direction of vehicle 450 in lanes 405 and 406. Given the location and trajectory information of the neighboring vehicles, the computer system has increased confidence that the two solid yellow lines 410 and the solid dashed white line 420 were appropriately identified.
- After the correct lane markings have been identified, the computer system can pass the identified correct lane markings to vehicle 450's decision making algorithms for use in controlling the vehicle's location and trajectory.
- Although the components and modules illustrated herein are shown and described in a particular arrangement, the arrangement of components and modules may be altered to process data in a different manner. In other embodiments, one or more additional components or modules may be added to the described systems, and one or more components or modules may be removed from the described systems. Alternate embodiments may combine two or more of the described components or modules into a single component or module.
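The FIG. 4 confidence check can be reduced to a per-neighbor consistency test against the lane geometry implied by the chosen markings. This sketch uses an assumed lateral-position/heading representation; the specification does not prescribe one.

```python
def trajectory_supports(lane_center_x, lane_half_width, own_heading, neighbor):
    """Return True when a neighboring vehicle's observed state is consistent
    with the candidate lane: same-direction traffic inside the lane bounds
    supports the identified markings, and oncoming traffic outside them does
    too. `neighbor` is a hypothetical dict with 'x' (lateral offset from the
    candidate lane center, meters) and 'heading'."""
    inside = abs(neighbor["x"] - lane_center_x) <= lane_half_width
    same_direction = neighbor["heading"] == own_heading
    # Supportive evidence: companion traffic in-lane, or oncoming traffic out-of-lane.
    return (inside and same_direction) or (not inside and not same_direction)

# FIG. 4-style scenario: candidate lane centered at x=0 with 1.8 m half-width.
lead_vehicle = {"x": 0.2, "heading": "north"}   # like vehicle 460 in lane 403
oncoming     = {"x": -5.0, "heading": "south"}  # like vehicle 464 in lane 405
assert trajectory_supports(0.0, 1.8, "north", lead_vehicle)
assert trajectory_supports(0.0, 1.8, "north", oncoming)
```

An oncoming vehicle found *inside* the candidate lane bounds would fail this test, which is exactly the case where the chosen markings deserve less confidence.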
- The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. Further, it should be noted that any or all of the aforementioned alternate embodiments may be used in any combination desired to form additional hybrid embodiments of the invention.
- Further, although specific embodiments of the invention have been described and illustrated, the invention is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the invention is to be defined by the claims appended hereto, any future claims submitted here and in different applications, and their equivalents.
Claims (20)
1. A method for determining lane markings for a vehicle to follow, the method comprising:
accessing sensor data from a plurality of sensors at the vehicle;
processing the accessed sensor data to identify multiple road-surface markings;
utilizing the sensor data to identify road-surface markings, from among the multiple road-surface markings, the vehicle is to follow; and
sending the identified road-surface markings to the vehicle's decision making algorithms for use in controlling the vehicle.
2. The method of claim 1 , further comprising:
processing the accessed sensor data to identify location and trajectory of neighboring vehicles; and
using the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings.
3. The method of claim 1 , wherein identifying multiple road-surface markings comprises detecting at least two candidate lane markings of a different visual appearance.
4. The method of claim 3 , wherein detecting at least two candidate lane markings of a different visual appearance comprises detecting at least two candidate lane markings that differ in one or more of: intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling.
5. The method of claim 1 , wherein accessing sensor data from a plurality of sensors comprises accessing data from two or more of: an image capture device, a lidar system, and a radar system.
6. A method for use at a computer system, the computer system including one or more processors and system memory, the method for distinguishing lane markings for a vehicle to follow, the method comprising:
accessing sensor data which has been gathered by a plurality of sensors, the plurality of sensors including a first type of sensor and at least a second different type of sensor, the sensor data indicating road-surface markings, the road-surface markings including intensity of color and reflectivity, the sensor data also indicating the location and trajectory of neighboring vehicles;
determining bounding boxes from the accessed sensor data, the bounding boxes representing regions of interest for distinguishing road-surface markings and neighboring vehicles;
processing the accessed sensor data within the bounding boxes to determine that multiple road-surface markings are present;
in response to determining that multiple road-surface markings are present:
utilizing the sensor data to identify road-surface markings, from among the multiple road-surface markings, the vehicle is to follow; and
using the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings; and
sending the identified road-surface markings to the vehicle's decision making algorithms for use in controlling the vehicle's location and trajectory.
7. The method of claim 6 , wherein determining that multiple road-surface markings are present comprises detecting at least two candidate lane markings of a different visual appearance.
8. The method of claim 7 , further comprising, determining a visual appearance of each of the multiple road-surface markings from one or more of: intensity of color, reflectivity, presence of lane reflectors, paint cracking, and road tape peeling.
9. The method of claim 6 , wherein the sensor data comes from a plurality of sensors, the plurality of sensors selected from among: lidar systems, image-capture devices, and radar systems.
10. The method of claim 9 , further comprising, using the location and trajectory of neighboring vehicles to increase confidence in the identified road-surface markings to follow.
11. The method of claim 6 , wherein accessing sensor data comprises accessing sensor data indicating one or more of speed limit information and stop sign information painted onto the road surface.
12. The method of claim 6 , wherein identifying the location and trajectories associated with neighboring vehicles on or near the road comprises identifying vehicles in the same lane as the vehicle.
13. The method of claim 6 , wherein identifying the location and trajectories associated with neighboring vehicles on or near the road comprises identifying vehicles in a different lane than the vehicle.
14. The method of claim 6 , wherein identifying the location and trajectories associated with neighboring vehicles comprises identifying vehicles traveling in the same direction as the vehicle.
15. The method of claim 6 , wherein identifying the location and trajectories associated with neighboring vehicles comprises identifying vehicles traveling in the opposite direction as the vehicle.
16. A system for distinguishing lane markings for a vehicle to follow, the system comprising:
one or more processors;
system memory;
a plurality of sensors, the plurality of sensors including a first type of sensor and at least a second different type of sensor;
one or more computer storage devices having stored thereon computer-executable instructions that, when executed, cause the vehicle computer system to:
access sensor data which has been gathered by a plurality of sensors, the sensor data indicating road-surface markings, the road-surface markings including intensity of color and reflectivity, the sensor data also indicating the location and trajectory of neighboring vehicles;
determine bounding boxes from the accessed sensor data, the bounding boxes representing regions of interest for distinguishing road-surface markings and neighboring vehicles;
process the accessed sensor data within the bounding boxes to determine that multiple road-surface markings are present;
in response to determining that multiple road-surface markings are present:
utilize the sensor data to identify road-surface markings, from among the multiple road-surface markings, the vehicle is to follow;
use the location and trajectory of neighboring vehicles to increase confidence with respect to the identified road-surface markings; and
send the identified road-surface markings to the vehicle's decision making algorithms for use in controlling the vehicle's location and trajectory.
17. The system of claim 16 , wherein computer-executable instructions that, when executed, cause the vehicle computer system to access sensor data comprise computer-executable instructions that, when executed, cause the vehicle computer system to receive at least one type of data from one of the vehicle's sensors and at least a second, different type of data from a different type of the vehicle's sensors.
18. The system of claim 16 , further comprising computer-executable instructions at the vehicle computer system that, when executed, cause the computer system to query the different types of sensor data for road surface markings.
19. The system of claim 16 , further comprising computer-executable instructions at the vehicle computer system that, when executed, cause the computer system to query the different types of sensor data for neighboring vehicles.
20. The system of claim 16 , further comprising computer-executable instructions at the vehicle computer system that, when executed, cause the computer system to parse the sensor data for regions of interest, the regions of interest being bound by bounding boxes.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/943,573 US9829888B2 (en) | 2015-11-17 | 2015-11-17 | Distinguishing lane markings for a vehicle to follow |
CN201610991318.XA CN107103272B (en) | 2015-11-17 | 2016-11-10 | Distinguishing lane markings to be followed by a vehicle |
DE102016121864.2A DE102016121864A1 (en) | 2015-11-17 | 2016-11-15 | Distinguish lane markers to be followed by a vehicle |
MX2016015049A MX2016015049A (en) | 2015-11-17 | 2016-11-16 | Distinguishing lane markings for a vehicle to follow. |
GB1619452.4A GB2546372A (en) | 2015-11-17 | 2016-11-17 | Distinguishing lane markings for a vehicle to follow |
US15/724,093 US10048691B2 (en) | 2015-11-17 | 2017-10-03 | Distinguishing lane markings for a vehicle to follow |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/943,573 US9829888B2 (en) | 2015-11-17 | 2015-11-17 | Distinguishing lane markings for a vehicle to follow |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/724,093 Continuation US10048691B2 (en) | 2015-11-17 | 2017-10-03 | Distinguishing lane markings for a vehicle to follow |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170139417A1 true US20170139417A1 (en) | 2017-05-18 |
US9829888B2 US9829888B2 (en) | 2017-11-28 |
Family
ID=57993822
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/943,573 Active 2036-01-01 US9829888B2 (en) | 2015-11-17 | 2015-11-17 | Distinguishing lane markings for a vehicle to follow |
US15/724,093 Active US10048691B2 (en) | 2015-11-17 | 2017-10-03 | Distinguishing lane markings for a vehicle to follow |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/724,093 Active US10048691B2 (en) | 2015-11-17 | 2017-10-03 | Distinguishing lane markings for a vehicle to follow |
Country Status (5)
Country | Link |
---|---|
US (2) | US9829888B2 (en) |
CN (1) | CN107103272B (en) |
DE (1) | DE102016121864A1 (en) |
GB (1) | GB2546372A (en) |
MX (1) | MX2016015049A (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109318904A (en) * | 2017-07-31 | 2019-02-12 | 通用汽车环球科技运作有限责任公司 | The differentiation of Vehicular turn and lane changing |
US20190071080A1 (en) * | 2017-09-06 | 2019-03-07 | Denso Corporation | Driving assistance apparatus |
CN110709906A (en) * | 2017-06-06 | 2020-01-17 | 爱知制钢株式会社 | Marker system and application method |
DE102018120049A1 (en) * | 2018-08-17 | 2020-02-20 | Valeo Schalter Und Sensoren Gmbh | Method for recognizing driving lanes for a motor vehicle with selection of a temporary driving lane by a user on a display unit, as well as vehicle guidance system and computer program product |
US20200070816A1 (en) * | 2018-09-04 | 2020-03-05 | Caterpillar Paving Products Inc. | Systems and methods for operating a mobile machine using detected sounds |
CN112020722A (en) * | 2018-12-29 | 2020-12-01 | 北京嘀嘀无限科技发展有限公司 | Road shoulder identification based on three-dimensional sensor data |
US10937178B1 (en) * | 2019-05-09 | 2021-03-02 | Zoox, Inc. | Image-based depth data and bounding boxes |
US10984543B1 (en) | 2019-05-09 | 2021-04-20 | Zoox, Inc. | Image-based depth data and relative depth data |
US11087494B1 (en) | 2019-05-09 | 2021-08-10 | Zoox, Inc. | Image-based depth data and localization |
US20220003558A1 (en) * | 2018-10-04 | 2022-01-06 | Toyota Jidosha Kabushiki Kaisha | Map information system |
US11225256B2 (en) | 2018-09-11 | 2022-01-18 | Honda Motor Co., Ltd. | Vehicle control system and control method of vehicle |
US11403860B1 (en) * | 2022-04-06 | 2022-08-02 | Ecotron Corporation | Multi-sensor object detection fusion system and method using point cloud projection |
US20230103020A1 (en) * | 2021-09-29 | 2023-03-30 | Volkswagen Aktiengesellschaft | Method for laterally controlling a motor vehicle on a road having two lanes and motor vehicle |
US11823016B2 (en) * | 2019-05-29 | 2023-11-21 | Bank Of America Corporation | Optimized IoT data processing for real-time decision support systems |
FR3140051A1 (en) * | 2022-09-22 | 2024-03-29 | Psa Automobiles Sa | Methods and systems for detecting temporary traffic lane boundary lines |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11348066B2 (en) | 2013-07-25 | 2022-05-31 | IAM Robotics, LLC | System and method for piece picking or put-away with a mobile manipulation robot |
US10229363B2 (en) * | 2015-10-19 | 2019-03-12 | Ford Global Technologies, Llc | Probabilistic inference using weighted-integrals-and-sums-by-hashing for object tracking |
JP6919428B2 (en) * | 2016-12-21 | 2021-08-18 | トヨタ自動車株式会社 | Vehicle data recording device |
DE102017116016A1 (en) * | 2017-07-17 | 2019-01-17 | Valeo Schalter Und Sensoren Gmbh | A motor vehicle sensor device having a plurality of sensor units and a neural network for generating an integrated representation of an environment |
US10678241B2 (en) | 2017-09-06 | 2020-06-09 | GM Global Technology Operations LLC | Unsupervised learning agents for autonomous driving applications |
DE102017223206A1 (en) * | 2017-12-19 | 2019-06-19 | Robert Bosch Gmbh | Low-dimensional determination of demarcated areas and movement paths |
DE102018112513A1 (en) * | 2018-05-24 | 2019-11-28 | Daimler Ag | Method and system for deriving a trajectory at a system boundary of an autonomous vehicle by a teleoperator |
CN108820042B (en) * | 2018-05-25 | 2020-04-10 | 东软集团股份有限公司 | Automatic driving method and device |
CN111257866B (en) * | 2018-11-30 | 2022-02-11 | 杭州海康威视数字技术股份有限公司 | Target detection method, device and system for linkage of vehicle-mounted camera and vehicle-mounted radar |
WO2020115515A1 (en) * | 2018-12-07 | 2020-06-11 | Micron Technology, Inc. | Lane departure apparatus, system and method |
CN109816626A (en) * | 2018-12-13 | 2019-05-28 | 深圳高速工程检测有限公司 | Road surface crack detection method, device, computer equipment and storage medium |
CN111771206B (en) * | 2019-01-30 | 2024-05-14 | 百度时代网络技术(北京)有限公司 | Map partitioning system for an autonomous vehicle |
US11142196B2 (en) * | 2019-02-03 | 2021-10-12 | Denso International America, Inc. | Lane detection method and system for a vehicle |
WO2020206457A1 (en) | 2019-04-05 | 2020-10-08 | IAM Robotics, LLC | Autonomous mobile robotic systems and methods for picking and put-away |
US11887381B2 (en) * | 2021-04-30 | 2024-01-30 | New Eagle, Llc | Use of HCNN to predict lane lines types |
CN114537443B (en) * | 2022-03-22 | 2024-05-03 | 重庆长安汽车股份有限公司 | Longitudinal control system and method for driving and parking |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6311123B1 (en) * | 1999-06-28 | 2001-10-30 | Hitachi, Ltd. | Vehicle control method and vehicle warning method |
US20120010797A1 (en) * | 2010-07-07 | 2012-01-12 | Robert Bosch Gmbh | System and method for controlling the engine of a vehicle |
US20130079990A1 (en) * | 2011-09-28 | 2013-03-28 | Honda Research Institute Europe Gmbh | Road-terrain detection method and system for driver assistance systems |
US20130242102A1 (en) * | 2011-04-13 | 2013-09-19 | Nissan Motor Co., Ltd. | Driving assistance device and method of detecting vehicle adjacent thereto |
US20140136414A1 (en) * | 2006-03-17 | 2014-05-15 | Raj Abhyanker | Autonomous neighborhood vehicle commerce network and community |
US20140172221A1 (en) * | 2012-12-19 | 2014-06-19 | Volvo Car Corporation | Method and system for assisting a driver |
US20140267415A1 (en) * | 2013-03-12 | 2014-09-18 | Xueming Tang | Road marking illuminattion system and method |
US20140358420A1 (en) * | 2013-05-28 | 2014-12-04 | Hyundai Motor Company | Apparatus and method for detecting traffic lane using wireless communication |
US20160012300A1 (en) * | 2014-07-11 | 2016-01-14 | Denso Corporation | Lane boundary line recognition device |
US20160176358A1 (en) * | 2014-12-22 | 2016-06-23 | Volkswagen Ag | Early detection of turning condition identification using perception technology |
US20160327948A1 (en) * | 2015-05-08 | 2016-11-10 | Toyota Jidosha Kabushiki Kaisha | Misrecognition determination device |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10311240B4 (en) * | 2003-03-14 | 2017-03-02 | Robert Bosch Gmbh | Method and device for tracking a vehicle |
JP3898709B2 (en) | 2004-05-19 | 2007-03-28 | 本田技研工業株式会社 | Vehicle lane marking recognition device |
JP4659631B2 (en) | 2005-04-26 | 2011-03-30 | 富士重工業株式会社 | Lane recognition device |
JP4616068B2 (en) | 2005-04-28 | 2011-01-19 | 本田技研工業株式会社 | Vehicle, image processing system, image processing method, and image processing program |
US20100292895A1 (en) * | 2007-04-27 | 2010-11-18 | Aisin Aw Co. Ltd | Driving support device |
US8194927B2 (en) * | 2008-07-18 | 2012-06-05 | GM Global Technology Operations LLC | Road-lane marker detection using light-based sensing technology |
JP5066123B2 (en) * | 2009-03-24 | 2012-11-07 | 日立オートモティブシステムズ株式会社 | Vehicle driving support device |
FR2977957B1 (en) * | 2011-07-12 | 2016-07-01 | Inst Francais Des Sciences Et Tech Des Transp De L'amenagement Et Des Reseaux (Ifsttar) | IMAGING DEVICE AND METHOD FOR PRODUCING AN IMAGE OF ROAD MARKING |
DE102012224125B4 (en) * | 2012-01-02 | 2024-03-07 | Ford Global Technologies, Llc | Method for lane keeping support for a driver of a motor vehicle and lane keeping assistance system |
DE102012004791A1 (en) * | 2012-03-07 | 2013-09-12 | Audi Ag | A method for warning the driver of a motor vehicle of an imminent danger situation as a result of unintentional drifting on an oncoming traffic lane |
JP5926080B2 (en) * | 2012-03-19 | 2016-05-25 | 株式会社日本自動車部品総合研究所 | Traveling lane marking recognition device and program |
US8504233B1 (en) | 2012-04-27 | 2013-08-06 | Google Inc. | Safely navigating on roads through maintaining safe distance from other vehicles |
GB201209972D0 (en) | 2012-06-06 | 2012-07-18 | Univ Warwick | Organic electron acceptor compounds |
US9053372B2 (en) | 2012-06-28 | 2015-06-09 | Honda Motor Co., Ltd. | Road marking detection and recognition |
DE112013004267T5 (en) * | 2012-08-30 | 2015-06-25 | Honda Motor Co., Ltd. | Road marking recognition device |
US9026300B2 (en) * | 2012-11-06 | 2015-05-05 | Google Inc. | Methods and systems to aid autonomous vehicles driving through a lane merge |
WO2014128532A1 (en) * | 2013-02-25 | 2014-08-28 | Continental Automotive Gmbh | Intelligent video navigation for automobiles |
JP5977275B2 (en) * | 2014-02-14 | 2016-08-24 | 株式会社日本自動車部品総合研究所 | Branch recognition device |
2015
- 2015-11-17: US application US14/943,573 filed; granted as US9829888B2 (active)
2016
- 2016-11-10: CN application CN201610991318.XA filed; granted as CN107103272B (active)
- 2016-11-15: DE application DE102016121864.2A filed; published as DE102016121864A1 (pending)
- 2016-11-16: MX application MX2016015049A filed (status unknown)
- 2016-11-17: GB application GB1619452.4A filed; published as GB2546372A (withdrawn)
2017
- 2017-10-03: US application US15/724,093 filed; granted as US10048691B2 (active)
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110709906A (en) * | 2017-06-06 | 2020-01-17 | 爱知制钢株式会社 | Marker system and application method |
CN109318904A (en) * | 2017-07-31 | 2019-02-12 | 通用汽车环球科技运作有限责任公司 | The differentiation of Vehicular turn and lane changing |
US20190071080A1 (en) * | 2017-09-06 | 2019-03-07 | Denso Corporation | Driving assistance apparatus |
US11577724B2 (en) * | 2017-09-06 | 2023-02-14 | Denso Corporation | Driving assistance apparatus |
DE102018120049A1 (en) * | 2018-08-17 | 2020-02-20 | Valeo Schalter Und Sensoren Gmbh | Method for recognizing driving lanes for a motor vehicle with selection of a temporary driving lane by a user on a display unit, as well as vehicle guidance system and computer program product |
WO2020035312A1 (en) | 2018-08-17 | 2020-02-20 | Valeo Schalter Und Sensoren Gmbh | Method for recognizing driving lanes for a motor vehicle with selection of a temporary driving lane by a user on a display unit, as well as vehicle guidance system and computer program product |
US20200070816A1 (en) * | 2018-09-04 | 2020-03-05 | Caterpillar Paving Products Inc. | Systems and methods for operating a mobile machine using detected sounds |
US10800409B2 (en) * | 2018-09-04 | 2020-10-13 | Caterpillar Paving Products Inc. | Systems and methods for operating a mobile machine using detected sounds |
US11225256B2 (en) | 2018-09-11 | 2022-01-18 | Honda Motor Co., Ltd. | Vehicle control system and control method of vehicle |
US11719555B2 (en) * | 2018-10-04 | 2023-08-08 | Toyota Jidosha Kabushiki Kaisha | Map information system |
US20220003558A1 (en) * | 2018-10-04 | 2022-01-06 | Toyota Jidosha Kabushiki Kaisha | Map information system |
CN112020722A (en) * | 2018-12-29 | 2020-12-01 | Beijing Didi Infinity Technology and Development Co., Ltd. | Road shoulder identification based on three-dimensional sensor data |
US11087494B1 (en) | 2019-05-09 | 2021-08-10 | Zoox, Inc. | Image-based depth data and localization |
US10984543B1 (en) | 2019-05-09 | 2021-04-20 | Zoox, Inc. | Image-based depth data and relative depth data |
US10937178B1 (en) * | 2019-05-09 | 2021-03-02 | Zoox, Inc. | Image-based depth data and bounding boxes |
US11748909B2 (en) | 2019-05-09 | 2023-09-05 | Zoox, Inc. | Image-based depth data and localization |
US11823016B2 (en) * | 2019-05-29 | 2023-11-21 | Bank Of America Corporation | Optimized IoT data processing for real-time decision support systems |
US20230103020A1 (en) * | 2021-09-29 | 2023-03-30 | Volkswagen Aktiengesellschaft | Method for laterally controlling a motor vehicle on a road having two lanes and motor vehicle |
US11403860B1 (en) * | 2022-04-06 | 2022-08-02 | Ecotron Corporation | Multi-sensor object detection fusion system and method using point cloud projection |
FR3140051A1 (en) * | 2022-09-22 | 2024-03-29 | Psa Automobiles Sa | Methods and systems for detecting temporary traffic lane boundary lines |
Also Published As
Publication number | Publication date |
---|---|
US9829888B2 (en) | 2017-11-28 |
GB201619452D0 (en) | 2017-01-04 |
MX2016015049A (en) | 2017-08-10 |
CN107103272A (en) | 2017-08-29 |
US20180024560A1 (en) | 2018-01-25 |
DE102016121864A1 (en) | 2017-05-18 |
CN107103272B (en) | 2022-06-28 |
GB2546372A (en) | 2017-07-19 |
US10048691B2 (en) | 2018-08-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10048691B2 (en) | Distinguishing lane markings for a vehicle to follow | |
US10077054B2 (en) | Tracking objects within a dynamic environment for improved localization | |
CN107644197B (en) | Rear camera lane detection | |
US11967230B2 (en) | System and method for using V2X and sensor data | |
CN108571974B (en) | Vehicle positioning using a camera | |
CN107284443B (en) | Detecting available parking spaces | |
US20200307589A1 (en) | Automatic lane merge with tunable merge behaviors | |
CN106778548B (en) | Method and apparatus for detecting obstacles | |
US20230102802A1 (en) | Map change detection | |
GB2560625A (en) | Detecting vehicles in low light conditions | |
US11913791B2 (en) | Crosswalk detection | |
CN111145569A (en) | Road monitoring and vehicle running control method and device and vehicle-road cooperative system | |
US20210405641A1 (en) | Detecting positioning of a sensor system associated with a vehicle | |
EP4095009B1 (en) | Method and device for operating a self-driving car | |
US20200257907A1 (en) | Calibration of fixed image-capturing device for depth estimation | |
Park | Implementation of lane detection algorithm for self-driving vehicles using tensor flow | |
EP4145420A1 (en) | Hierarchical processing of traffic signal face states | |
CN108022250B (en) | Automatic driving processing method and device based on self-adaptive threshold segmentation | |
US20240200957A1 (en) | Electronic device for detecting vehicle driving behavior multi dimensionally and method thereof | |
US20240246570A1 (en) | Path planning system and path planning method thereof | |
CN116993885A (en) | Road scene rendering method and device, electronic equipment and storage medium | |
WO2021068210A1 (en) | Method and apparatus for monitoring moving object, and computer storage medium | |
US20200258256A1 (en) | Calibration of fixed image-capturing device for depth estimation | |
JPWO2019057252A5 (en) | ||
CN117944713A (en) | Automatic driving method, device, domain controller, medium, system and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FORD GLOBAL TECHNOLOGIES, LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REIFF, BRIELLE;JAIN, JINESH J;KADETOTAD, SNEHA;SIGNING DATES FROM 20151102 TO 20151106;REEL/FRAME:037061/0981 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |