US20200324764A1 - Vehicular control system with pedestrian avoidance - Google Patents


Info

Publication number
US20200324764A1
US20200324764A1 (application US16/946,434)
Authority
US
United States
Prior art keywords
vehicle
equipped vehicle
control
pedestrian
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US16/946,434
Inventor
Nathaniel S. Johnson
Christopher L. Van Dan Elzen
Christoph Klas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magna Electronics Inc
Original Assignee
Magna Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magna Electronics Inc filed Critical Magna Electronics Inc
Priority to US16/946,434
Publication of US20200324764A1
Legal status: Pending

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W 30/10: Path keeping
    • B60W 30/12: Lane keeping
    • B60W 60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001: Planning or execution of driving tasks
    • B60W 60/0027: Planning or execution of driving tasks using trajectory prediction for other traffic participants
    • B60W 10/00: Conjoint control of vehicle sub-units of different type or different function
    • B60W 10/04: Conjoint control including control of propulsion units
    • B60W 10/06: Conjoint control including control of combustion engines
    • B60W 10/18: Conjoint control including control of braking systems
    • B60W 10/184: Conjoint control including control of braking systems with wheel brakes
    • B60W 10/20: Conjoint control including control of steering systems
    • B60W 30/14: Adaptive cruise control
    • B60W 30/16: Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G08G 1/167: Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • B60W 2420/00: Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W 2420/40: Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W 2420/403: Image sensing, e.g. optical camera
    • B60W 2420/42
    • B60W 2520/00: Input parameters relating to overall vehicle dynamics
    • B60W 2520/10: Longitudinal speed
    • B60W 2554/00: Input parameters relating to objects
    • B60W 2554/40: Dynamic objects, e.g. animals, windblown objects
    • B60W 2554/402: Type
    • B60W 2554/4029: Pedestrians
    • B60W 2554/404: Characteristics
    • B60W 2554/4041: Position
    • B60W 2554/406: Traffic density
    • B60W 2554/80: Spatial relation or speed relative to objects
    • B60W 2554/801: Lateral distance
    • B60W 2554/803: Relative lateral speed
    • B60W 2554/804: Relative longitudinal speed
    • B60W 2720/00: Output or target parameters relating to overall vehicle dynamics
    • B60W 2720/10: Longitudinal speed

Definitions

  • the present invention relates to imaging systems or vision systems for vehicles.
  • the present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle (such as forwardly and rearwardly of the vehicle), and provides the communication/data signals, including camera data or image data, that may be displayed at a display screen that is viewable by the driver of the vehicle, such as when the driver is backing up the vehicle, and that may be processed and, responsive to such image processing, the system may detect an object at or near the vehicle and in the path of travel of the vehicle, such as when the vehicle is backing up.
  • the vision system may be operable to display a surround view or bird's eye view of the environment at or around or at least partially surrounding the subject or equipped vehicle.
  • a vision system for a vehicle includes one or more cameras or image sensors disposed at a vehicle and having respective fields of view exterior of the vehicle, and an image processor operable to process data transmitted by the cameras.
  • the vision system includes a forward facing camera module (having image processing circuitry incorporated therein) and may also include a rearward facing vision camera (for capturing video image data that is displayed on a display of the vehicle for viewing by the driver of the vehicle during a reversing maneuver) and/or sideward facing cameras.
  • the vision system may provide a variety of functions by utilizing captured image data from one or more of the cameras at the vehicle, such as a forward viewing camera, a rearward viewing camera, side viewing cameras and/or a forward viewing windshield mounted camera (having a field of view through the windshield of the vehicle).
  • the vision system may have a front windshield camera module that may have image data processing capabilities for that camera and for one or more other cameras of the vehicle, or multiple cameras (such as a forward viewing camera at a forward portion of the vehicle, a rearward viewing camera, side viewing cameras, a forward viewing camera that views through a windshield of the vehicle, and optionally a night vision camera) may feed into a common image data processing module.
  • the vision system of the present invention may be operable to determine (such as via image processing of captured image data and via a speed of the vehicle) when the vehicle is traveling in traffic, such as high volume traffic, a traffic jam situation or the like, and a controller or control system may control or autonomously drive the vehicle during such traffic conditions.
  • the system may determine when a lane splitting vehicle (such as a motorcycle or motor scooter or bicycle or other small vehicle) is driving between lanes of traffic (commonly referred to as lane splitting) and may control the subject vehicle accordingly. For example, when the system detects a vehicle or motorcycle approaching (such as from behind the vehicle or ahead of the vehicle) at the left side, the system may move the subject vehicle towards the right side of the subject vehicle's lane or occupied lane and away from the lane splitting motorcycle, while still remaining in the occupied lane. After the detected motorcycle passes, the system may move the subject vehicle back towards the center of the occupied lane. Also, responsive to detection of a lane splitting motorcycle or the like, the system can limit or prevent lane change if such a lane change would result in collision with the lane splitting motorcycle.
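The lane splitting response described above reduces to a small decision rule: bias away from the detected lane splitter while staying in the occupied lane, return to center once it passes, and suppress any lane change toward it. The sketch below is illustrative only; the function names, the 0.5 m offset, and the side encoding are assumptions, not values from the disclosure:

```python
def lane_offset_for_lane_splitter(splitter_side, splitter_detected, max_offset_m=0.5):
    """Lateral offset command (metres, positive = toward the right lane
    marker) while remaining within the occupied lane. Returns 0.0 to
    hold lane center once the lane splitting vehicle has passed."""
    if not splitter_detected:
        return 0.0
    # Bias away from the detected lane splitter, e.g. move right when a
    # motorcycle approaches on the left, without making a lane change.
    return max_offset_m if splitter_side == "left" else -max_offset_m

def lane_change_allowed(target_side, splitter_side, splitter_detected):
    """Block a lane change that would collide with the lane splitter."""
    return not (splitter_detected and target_side == splitter_side)
```

In practice the offset would be rate-limited and clipped against the detected lane-marking distances; this sketch only captures the decision logic.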
  • the system may be operable to detect pedestrians and may slow or stop to allow for pedestrians to cross the road in front of the vehicle as may occur in high volume traffic situations and/or fast or slow moving traffic situations, such as in a crowded city street or the like.
  • the control system at least in part responsive to detection of a stationary pedestrian in the path of travel of the equipped vehicle, may be operable to stop the equipped vehicle.
  • the control system at least in part responsive to detection of a moving pedestrian in the path of travel of the equipped vehicle, may be operable to slowly move the equipped vehicle forward at a speed that allows the pedestrian time to move out of the path of travel of the equipped vehicle.
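The two pedestrian behaviors above (full stop for a stationary pedestrian in the path, slow forward creep for a moving one) can be sketched as a single speed-command rule. Function name, the 0.1 m/s stationary threshold, and the 1.0 m/s creep speed are hypothetical:

```python
def pedestrian_speed_command(ped_in_path, ped_speed_mps, creep_speed_mps=1.0):
    """Speed command (m/s) imposed by a pedestrian in the path of travel.

    Returns None when no pedestrian constrains the speed, 0.0 (stop) for
    an effectively stationary pedestrian, and a slow creep speed for a
    moving pedestrian so they have time to clear the path.
    """
    if not ped_in_path:
        return None
    if ped_speed_mps < 0.1:   # effectively stationary
        return 0.0
    return creep_speed_mps
```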
  • the system thus is operable to determine the driving condition or traffic condition of the subject vehicle and, when that determined condition is indicative of traffic, such as high volume traffic or slow moving traffic or a traffic jam, the system may control the subject vehicle to drive the subject vehicle in the traffic.
  • the system detects the surrounding vehicles and determines the appropriate driving direction and speed for the subject vehicle to move the subject vehicle with the traffic flow and to adapt the driving of the subject vehicle to the surrounding vehicles and traffic flow.
  • FIG. 2 is a flow chart of the vision system control of an equipped vehicle in accordance with the present invention, showing vehicle control in a lane splitting situation;
  • FIG. 4 is a flow chart of the vision system control of an equipped vehicle in accordance with the present invention, showing vehicle control in an expanding lane situation;
  • FIG. 5 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in an expanding lane situation;
  • FIG. 6 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in a lane merging situation when the equipped vehicle accepts another vehicle's attempt to merge ahead of the equipped vehicle;
  • FIG. 7 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in a lane merging situation when the equipped vehicle rejects another vehicle's attempt to merge ahead of the equipped vehicle;
  • FIG. 8 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in pedestrian situations, where the vehicle stops when a non-moving pedestrian is in the immediate path of travel of the vehicle;
  • FIG. 9 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in pedestrian situations, where the vehicle moves slowly forward when moving pedestrians are in the path of travel of the vehicle or have a trajectory which could cross the path of the vehicle;
  • FIG. 10 is a schematic of the functional structure of the traffic assist system of the present invention.
  • FIG. 11 is an image showing an overlay of object selection and path planning data for use in system analysis;
  • FIG. 12 is an example of the operation of the traffic assist system of the present invention.
  • FIGS. 13 and 14 are schematics of how the vision system of the present invention controls the equipped vehicle when another vehicle wants to pull out in front of or behind the equipped vehicle, such as when the equipped vehicle is blocking or partially blocking a driveway or the like;
  • FIG. 15 is a schematic of the functional principle of the lateral controller of the system of the present invention.
  • FIG. 16 is a schematic of the functional principle of the longitudinal controller of the system of the present invention.
  • a vehicle vision system and/or driver assist system and/or object detection system and/or alert system and/or control system and/or autonomous vehicle control system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction.
  • the vision system includes a processor that is operable to receive image data from the vehicle cameras and may provide a displayed image that is representative of the subject vehicle (such as for a top down or bird's eye or surround view, such as discussed below).
  • the vision and display system may utilize aspects of the systems described in U.S. Pat. Nos.
  • the vision system may have a front windshield camera module that may have image data processing capabilities for that camera and for one or more other cameras of the vehicle, or multiple cameras (such as a forward viewing camera at a forward portion of the vehicle, a rearward viewing camera, side viewing cameras, a forward viewing camera that views through a windshield of the vehicle, and optionally a night vision camera) may feed into a common image data processing module, such as by utilizing aspects of the vision systems described in U.S. patent application Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503, which is hereby incorporated herein by reference in its entirety.
  • the vision system 12 includes a control or processor 18 that is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle).
  • the vision system may also operate in conjunction with other sensors of the vehicle, such as RADAR sensors or LIDAR sensors or Time-of-Flight (TOF) sensors or Ultrasonic sensors or the like.
  • the system thus may be operable to provide enhanced detection of objects or other vehicles at or near the subject or equipped vehicle and may determine the distance to the objects or other vehicles and the speed and directional heading of the detected objects or other vehicles relative to the equipped vehicle.
  • the system of the present invention is operable to provide a driver assist or traffic jam assist function (providing lateral and longitudinal control in pedestrian and/or traffic scenarios (low speed, controlled access road)).
  • the vision system of the present invention may provide various features, such as, for example, a full autonomous driving function including autonomous lane change to overtake slower cars, construction area driving and lane merges, an autonomous pull-over maneuver function in case of an incapacitated and/or unresponsive driver, an automatic trailer hookup function (which is operable to guide the vehicle to a trailer), an automatic cruise control (ACC) automatic go in a stop and go ACC (such as for city driving conditions/environments), an enhanced automatic emergency braking (AEB) function based on rear traffic (optionally, for example, overriding or not braking or delaying braking if rear traffic is present), a blind spot detection function (to limit or prevent accidents during lane change maneuvers), an onramp assist function (to predict whether the equipped vehicle can accelerate enough to merge with existing traffic before the end of the onramp), a
  • the system of the present invention is thus operable to detect traffic behaviors, and may control the equipped vehicle to maneuver the vehicle with the traffic flow in a manner that provides safe travel and that controls the vehicle so that the vehicle is driven in a similar manner as the other vehicles on the road, in order to enhance traffic flow.
  • the system may use any suitable processing means or protocol to determine the traffic conditions and to detect vehicles and/or pedestrians on or off the road being traveled by the equipped vehicle and on or off cross roads and merging roads at or near the traveled road.
  • the system may learn or adapt the driving or control of the vehicle (such as during the driving or control of the vehicle or before taking control of the vehicle) responsive to the driving or maneuvering of other vehicles on the road.
  • the vision system or control system of the present invention is operable, at least when in a traffic situation, to detect a lane splitting vehicle, such as a motorcycle cutting through slow moving traffic and between vehicles traveling along two adjacent lanes, and to control the equipped vehicle accordingly.
  • the system may utilize a side camera and/or a rear camera and/or a blind spot radar sensor or the like to detect the presence or approach of a lane splitting vehicle (such as in conjunction with a lane marker detection so the system is aware of the lane delineations for the lane in which the equipped vehicle is travelling).
  • the control system may steer the vehicle to the right, while remaining within the lane that the equipped vehicle occupies (in other words, moving the vehicle towards the right side lane marker but without making a lane change). Such movement allows for more room for the lane splitting vehicle to pass at the left side more safely.
  • the system may control the equipped vehicle to move back towards the center of the occupied lane, and then may travel with the traffic flow along the occupied lane.
  • the system may determine when vehicle travel or vehicle “lanes” are different from the lanes marked on the road, and may control the equipped vehicle to follow one of the leading vehicles.
  • the control system may detect an increase in traffic or a shift in traffic ahead of the equipped vehicle (such as when there are more lanes of vehicles than road lanes) and may determine which path or line of vehicles to follow.
  • the system may select a faster moving line of vehicles or a particular side or direction (such as, for example, the right side line of vehicles when the equipped vehicle is approaching an exit or right turn along its selected or predetermined route), and may control or steer the vehicle to follow the vehicles of the selected line of vehicles.
  • the system thus selects or chooses a target vehicle to follow, which may be a faster vehicle and/or may be partially occupying the road lane that the equipped vehicle is traveling, and such a selection may be based at least partially on the intent of a vehicle adjacent to the equipped vehicle. For example, and as shown in FIG. 5 , if the equipped vehicle 40 selects a vehicle 50 to the left to follow, but a left side adjacent vehicle 60 is moving to follow that vehicle (or otherwise indicates that it intends to follow that vehicle), the system may select a different vehicle to follow or may adjust the driving to fall in behind the adjacent vehicle or the like.
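The target-vehicle selection just described can be sketched as picking the fastest candidate leader that no adjacent vehicle has already signaled intent to follow. This is an illustrative simplification; the data layout and the speed-only ranking are assumptions:

```python
def select_target_vehicle(candidates, claimed_ids):
    """Choose a leading vehicle to follow in dense traffic.

    candidates: list of dicts like {"id": ..., "speed_mps": ...} for the
    heads of the observed lines of vehicles.
    claimed_ids: ids of leaders an adjacent vehicle intends to follow
    (e.g. vehicle 60 in FIG. 5 moving in behind vehicle 50).
    Returns the fastest unclaimed candidate, or None if all are claimed.
    """
    free = [c for c in candidates if c["id"] not in claimed_ids]
    if not free:
        return None
    return max(free, key=lambda c: c["speed_mps"])
```

A fuller implementation would also weight route intent (e.g. prefer the right-side line before an exit), as the description notes.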
  • the system overrides any lane departure warning system or alert and steers the equipped vehicle outside of its occupied lane and may even continue to drive the vehicle along a lane marker and thus not in any marked lane during such a traffic condition, and may even drive the vehicle partially onto or fully onto a shoulder of the road to follow the selected line of vehicles.
  • the system may also utilize a navigation system and/or pavement detection or the like to make sure that the equipped vehicle stays on its intended or selected course or route when following vehicles outside of the road lanes. The system may alert the driver that the vehicle or system is entering this special driving mode before entering or commencing that mode and during the out of lane maneuvers.
  • the system may determine when a gap between consecutive vehicles (a leading vehicle and a trailing vehicle following the leading vehicle along the same or similar path and/or in the same lane of traffic) in an adjacent lane is sufficient to begin moving over and into the adjacent lane, and the system may steer the vehicle towards and into that gap (and may actuate the turn signal accordingly, if such turn signal use is appropriate), such as in a manner that allows the following vehicle behind the gap to slow to allow the equipped vehicle to enter the gap.
  • the system may be operable to determine when the other vehicles do not allow such a lane change maneuver (such as when the other vehicle does not slow down to allow the lane change maneuver), and may return the vehicle to its lane and try again at a later gap.
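The gap-acceptance test implied above has two parts: the gap must physically fit the equipped vehicle with some margin, and the trailing vehicle must actually be yielding (not closing the gap). A minimal sketch, with all names and thresholds assumed rather than taken from the disclosure:

```python
def gap_accepts_merge(gap_length_m, ego_length_m, margin_m=2.0,
                      trailing_closing_speed_mps=0.0, max_closing_mps=1.5):
    """True when a gap in the adjacent lane is safe to merge into.

    fits: gap holds the ego vehicle plus a margin at both ends.
    yielding: the trailing vehicle is not closing the gap faster than a
    threshold; if it is, the maneuver is aborted and retried at a later
    gap, per the description above.
    """
    fits = gap_length_m >= ego_length_m + 2.0 * margin_m
    yielding = trailing_closing_speed_mps <= max_closing_mps
    return fits and yielding
```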
  • the system may adjust its control or sensitivities responsive to the geographical location and/or driving behavior of the other vehicle drivers, and may learn or adapt responsive to the current driving conditions and driving behaviors.
  • the system may determine when another vehicle driver wants to cut into the lane of the equipped vehicle ahead of the equipped vehicle. Responsive to such a determination, the system may (such as shown in FIG. 6 ) slow or stop the equipped vehicle 40 to allow for the other vehicle 60 to cut in, or may (such as shown in FIG. 7 ) move the equipped vehicle 40 to the side to reject the cut in but partially allow the other vehicle 60 to continue traveling (where eventually the other vehicle should slow and fall in behind the equipped vehicle), depending on the particular driving conditions and traffic flow and traffic situation.
  • the system may adapt or calibrate its sensitivity or processing so that, after one vehicle cuts in, the system is less tolerant of other vehicles cutting in as well, to avoid a potential situation where the system stops the vehicle and allows a steady stream of other vehicles to cut in ahead of the equipped vehicle.
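The adaptive cut-in tolerance described above can be modeled as a required-gap threshold that grows after each accepted cut-in and slowly decays back, so the system never admits a steady stream of merging vehicles. The class name and the metre values are hypothetical:

```python
class CutInPolicy:
    """Accept or reject a cut-in attempt, becoming less tolerant after
    each acceptance and relaxing back toward a base tolerance."""

    def __init__(self, base_gap_m=6.0, penalty_m=3.0, decay_m=0.5):
        self.base_gap_m = base_gap_m
        self.required_gap_m = base_gap_m   # current acceptance threshold
        self.penalty_m = penalty_m
        self.decay_m = decay_m

    def decide(self, offered_gap_m):
        if offered_gap_m >= self.required_gap_m:
            # Accepting one cut-in raises the bar for the next one.
            self.required_gap_m += self.penalty_m
            return "accept"
        # On rejection, relax slowly back toward the base tolerance.
        self.required_gap_m = max(self.base_gap_m,
                                  self.required_gap_m - self.decay_m)
        return "reject"
```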
  • the system may stop the vehicle or maneuver the vehicle so as to exclude any and all paths that are occupied or partially occupied by one or more stationary pedestrians.
  • responsive to detection of a stationary or non-moving pedestrian 72 (such as shown in FIG. 8 ) in the path of travel of the equipped vehicle, the system may stop the vehicle to avoid collision with the non-moving pedestrian.
  • for moving pedestrians 70 (such as pedestrians crossing the road through the traffic), the system may determine a predicted path of the pedestrian or pedestrians and may maneuver the vehicle or slow the vehicle to make sure that the equipped vehicle avoids any conflict or potential conflict or collision with the crossing pedestrian.
  • the vision system may utilize rear image processing for lane detection.
  • the system may apply lane detection and tracking aspects from front image processing to rear images captured by one or more rearward facing cameras of the vehicle.
  • the system may detect the lane markings and may determine the lateral distance to a left or right lane marking, and may control steering and/or provide an alert to the driver responsive to the detected distance to the lane markings.
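The use of the lateral distances to the lane markings can be sketched as a proportional correction toward lane center plus a proximity warning. This is an assumed control form (the disclosure does not specify a control law); gain and warning threshold are placeholders:

```python
def lateral_correction(dist_left_m, dist_right_m, gain=0.5, warn_m=0.3):
    """From measured distances to the left and right lane markings,
    return (steer, warn): a proportional steering command toward lane
    center (positive = steer right) and a flag raised when either
    marking is closer than the warning threshold."""
    offset = (dist_right_m - dist_left_m) / 2.0   # + when biased left of center
    steer = gain * offset                         # steer toward lane center
    warn = min(dist_left_m, dist_right_m) < warn_m
    return steer, warn
```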
  • the system may utilize the rearward lane marking detection to provide enhanced detection of a lane splitting vehicle or motorcycle or scooter.
  • the system thus provides increased availability of lane information to the driver, and may warn the driver even where lane departure prevention (LDP) from the front camera may not be available, such as in low lighting conditions or situations, traffic jams (when preceding vehicles block lane markings), tunnel entry and/or the like.
  • the rear lane detection of the present invention may be used for autonomous driving/lane keeping where high lane data availability is important.
  • the vision system of the present invention may be operable to provide other various functions.
  • the vision system may operate with or be associated with an adaptive automatic emergency braking (AEB) system of the vehicle, such that, when the subject vehicle determines that braking is desired or appropriate, the subject vehicle may, for example, brake earlier and/or harder, if no vehicle is following (as determined by image processing of the image data captured by the rearward facing camera), and risk of rear end collision is low, or may brake later, if the vision system determines that a vehicle is following, and the risk of rear end collision is higher.
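The adaptive AEB timing above amounts to shifting the braking trigger based on rear-end-collision risk: brake earlier and harder with no follower, later and more gently when a follower is detected. The time-to-collision thresholds and braking fractions below are hypothetical placeholders:

```python
def aeb_brake_profile(ttc_s, follower_present, early_ttc_s=2.0, late_ttc_s=1.2):
    """Return the commanded braking level (fraction of maximum, 0.0-1.0)
    for a given time-to-collision (ttc_s, seconds) with the object ahead.

    With no follower (low rear-end risk) braking triggers earlier and at
    full force; with a follower it triggers later and less hard."""
    trigger_ttc = late_ttc_s if follower_present else early_ttc_s
    if ttc_s > trigger_ttc:
        return 0.0   # no autonomous braking yet
    return 1.0 if not follower_present else 0.6
```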
  • the vision system may provide improved or enhanced lane detection at low sun/and increased availability of LDW based on detecting lanes in the rear camera images.
  • the vision system may provide rear pedestrian detection, and may provide a warning or may brake if a pedestrian is detected in the rear images, such as during a reversing maneuver of the vehicle.
  • the vision system may provide a rear object detection, and may provide a warning or the like if a general object is detected in the rear images.
  • the vision system may provide a rear cross traffic alert function, and may detect or determine crossing vehicles and may be operable to alert or warn the driver of the subject vehicle of crossing vehicles when the driver is reversing or backing up the subject vehicle.
  • the system of the present invention may utilize aspects of the systems described in U.S. patent application Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503, which is hereby incorporated herein by reference in its entirety.
  • the present invention may also or otherwise provide enhanced control or semi-autonomous driving features to assist the driver of the vehicle during high traffic conditions or situations, such as during a commute to or from work at rush hour type traffic.
  • Commuter mobility has been constantly increasing over the past decades. The number of accidents shows the same tendency and is not decreasing significantly, even though new active and passive safety features have been introduced in modern vehicles.
  • Automated driving has the potential to improve the safety and productive time of commuters. However, many automated driving functions in development rely on expensive sensors. For automated driving to become affordable to the public, its cost must be reduced.
  • the present invention provides a limited semi-autonomous driving system for traffic jam situations that may operate with only a single mono-camera sensor (although clearly the semi-autonomous driving assist system of the present invention may operate using multiple cameras of the vehicle).
  • the assist system is focused on relieving the driver of the mundane task of driving in heavy traffic. It is capable of operating in a low speed range and it does not require any additional HMI or actuation beyond what is already available in vehicles with Lane Keep Assist or Cruise Control or the like.
  • the present invention utilizes an algorithm that achieves a partially automated driving function while providing a cost-effective hardware setup.
  • the environment sensing may be performed by a monocular camera only and the software may work on a single ECU.
  • the traffic driving assist system of the present invention may decrease the driver's workload in monotonous situations in congested traffic.
  • the driver or user can activate the driver assist function in a slow traffic situation to relieve monotony.
  • optional preconditions for activation of the system may include the availability of lane markings and existence of a preceding vehicle within a certain distance.
  • the user may activate the system during a traffic condition by activating a user input, such as by pressing a button or the like. After this, the driver no longer needs to provide steering, accelerator or brake pedal input while in the traffic condition.
  • the driver assist function will take over the lateral guidance of the equipped vehicle and keep it generally centered in the lane.
  • the longitudinal behavior is controlled by the system to follow the closest target vehicle in the same lane (Same Lane Target), maintaining an appropriate amount of distance or gap between the target vehicle and the equipped vehicle. If the target vehicle stops then the equipped vehicle will come to a stop close behind the target vehicle and will resume driving if the target vehicle then continues, all without any further input needed from the driver of the equipped vehicle.
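The stop-and-go following behavior can be sketched with a constant-time-gap policy: command an acceleration proportional to the gap error and the closing speed, and hold the vehicle stopped close behind a stopped target. A hedged sketch; the gains and distances are illustrative assumptions:

```python
def follow_accel(gap_m, ego_speed, target_speed,
                 time_gap_s=1.5, standstill_gap_m=3.0,
                 kp=0.3, kv=0.6):
    """Constant-time-gap car following with stop-and-go.

    The desired gap grows with speed; commanded acceleration is a weighted
    sum of gap error and relative speed, clamped to comfort limits.
    All gains and distances are illustrative, not from the patent.
    """
    desired_gap = standstill_gap_m + time_gap_s * ego_speed
    gap_error = gap_m - desired_gap
    rel_speed = target_speed - ego_speed
    accel = kp * gap_error + kv * rel_speed
    # clamp to the acceleration limits discussed elsewhere in the text
    return max(-4.0, min(1.5, accel))
```

When the target stops, the gap error shrinks toward zero at the standstill gap and the command settles at zero, holding the vehicle stopped until the target moves again.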
  • the system may be operable to warn the driver and request driver takeover of the vehicle.
  • the system may continue to control the vehicle in the safest manner possible with limited information until the driver takes over control of the vehicle. This provides time for a possibly inattentive driver to react to the system request or alert. If the driver does not act within a certain amount of time following the warning or alert, the system may disengage or enter a safe state and gently slow the vehicle to a stop.
  • the system requirements and operational parameters may be derived from an analysis of traffic jam data collected with human drivers.
  • the velocity range in which the system may operate may be defined to allow automated driving in most traffic jam scenarios, such as at speeds below about 70 km/hr (data has shown that 99 percent of the traffic jam driving speed range is below 70 km/hr).
  • the maximum speed at which the system may control the vehicle may be set to about 72 km/hr (or about 45 mph).
  • Traffic data shows that acceleration values occurring in congested traffic are mostly below about 1.5 m/s², so the limit of the system may be set to this value, especially to allow quick starting after standstill without opening a large gap to the preceding vehicle.
  • a deceleration threshold of about −1.3 m/s² is reasonable, since most situations can be handled within this range.
  • the global minimum of the allowed deceleration was set to −4 m/s² to also enable the system to react to more critical situations, such as, for example, close cut-in maneuvers or the like.
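These limits can be collected into a small helper that clamps a requested acceleration to the comfort band and opens the stronger deceleration reserve only when a critical situation (such as a close cut-in) is flagged. The numeric limits follow the values above; the helper itself is illustrative:

```python
def clamp_accel(requested_mps2: float, critical: bool = False) -> float:
    """Clamp a requested acceleration to the system's limits.

    Comfort band: +1.5 m/s^2 to -1.3 m/s^2 (typical congested traffic);
    in critical situations the deceleration floor opens to -4.0 m/s^2.
    """
    max_accel = 1.5
    min_accel = -4.0 if critical else -1.3
    return max(min_accel, min(max_accel, requested_mps2))
```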
  • a minimum radius of curvature of the road trajectory may be defined to be, for example, about 180 m.
  • the maximum duration of system operation without any driver interaction may be restricted in order to ensure that the driver's attention will not drift and that he or she is still alert. This time period may be set to, for example, about 120 seconds before a driver reaction is actually requested by the system.
  • the global structure of the architecture allows for modularity.
  • the same code can therefore be used in different setups or vehicles.
  • the functional structure of the system of the present invention is shown in FIG. 10 .
  • the Path Planning module calculates the desired behavior based on the sensor data.
  • the Target Object Selection module extracts and flags the traffic objects which are relevant for the decision.
  • the actual longitudinal and lateral pathing is defined.
  • the global system status is determined within a state machine. This status determines if the system is enabled and the function is ready to be activated. Because the lateral and longitudinal guidance can be active at the same time or independently, there are states for all three possibilities. Additionally, a Safe Mode state may be implemented for the case that a safe operation cannot be guaranteed, such as, for example, when there is an extended period of driver unresponsiveness.
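A minimal sketch of such a state machine, with states for the three active guidance combinations plus Safe Mode (the state names and transition logic are assumptions for illustration, not from this disclosure):

```python
from enum import Enum, auto

class SysState(Enum):
    OFF = auto()
    LATERAL_ONLY = auto()
    LONGITUDINAL_ONLY = auto()
    LATERAL_AND_LONGITUDINAL = auto()
    SAFE_MODE = auto()

def next_state(current, lat_ok, lon_ok, driver_unresponsive):
    """Tiny transition function: Safe Mode dominates; otherwise the state
    reflects which guidance channels are currently available."""
    if driver_unresponsive:
        return SysState.SAFE_MODE
    if lat_ok and lon_ok:
        return SysState.LATERAL_AND_LONGITUDINAL
    if lat_ok:
        return SysState.LATERAL_ONLY
    if lon_ok:
        return SysState.LONGITUDINAL_ONLY
    return SysState.OFF
```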
  • the situation is classified based on environmental data, data of the ego vehicle (the subject or equipped vehicle, SV) and path planning data. Class parameters are adapted according to the detected situation. Because the vehicle dynamics in both the lateral and longitudinal dimensions are highly dependent on the actual velocity of the vehicle, the control parameters are also continuously adjusted in relation to the SV's current speed.
  • the two controllers for longitudinal and lateral guidance generate the actual control signals that are sent to the vehicle interface.
  • the lateral controller is implemented as a state controller and the longitudinal guidance is implemented as a sliding-mode controller.
  • the task of the Path Planning module is to take into account the environmental information provided by sensor perception and generate the trajectory.
  • the preceding vehicle in the same lane (the Same Lane Target, or SALT) usually has the most significant influence on the longitudinal trajectory.
  • the closest target in the left adjoining lane is also relevant if it is closer and moving more slowly than the SALT.
  • the selection of all these relevant targets and calculation of their trajectories is the task of the Target Object Selection.
  • An example for tagged objects is shown in FIG. 11 . Also, some of the calculated output of the Path Planning module can be seen at the lower region of FIG. 11 .
  • the lane position is extrapolated based on the approximation y_tgt ≈ r_lane·x_tgt + (K_lane/2)·x_tgt², where:
  • y_tgt is the lateral position of the lane marking at the longitudinal position x_tgt of the target,
  • r_lane is the relative heading angle between the SV and the lane, and
  • K_lane is the curvature of the lane.
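Evaluated numerically, this extrapolation amounts to a quadratic lane model in the heading and curvature terms. A sketch under the standard small-angle lane approximation (the exact form used in this disclosure is an assumption):

```python
import math

def extrapolate_lane_y(x_tgt_m, heading_rad, curvature_per_m, y0_m=0.0):
    """Approximate lateral position of the lane marking at distance x_tgt.

    y_tgt ~= y0 + tan(heading) * x + (curvature / 2) * x^2
    (standard quadratic lane approximation; the exact form in the patent
    is not reproduced here).
    """
    return (y0_m
            + math.tan(heading_rad) * x_tgt_m
            + 0.5 * curvature_per_m * x_tgt_m ** 2)
```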
  • the path may be adapted to increase lateral separation. If only a single marking can be successfully tracked then the detected lane's information is mirrored for a certain time period under the assumption that the width of the lane is not changing quickly. After this duration, the virtual lane width is slowly reduced to a minimum to allow lateral guidance as long as possible. In the event that no lane markings are available or the distance to the SALT is small at a low velocity, the lateral path is based on the track of the SALT.
  • the stopping distance value also has some safety distance added, compared to human drivers, who stop at a distance of about 2.5 m on average.
  • a Slow Mode condition has been implemented to treat the low speed approach to a slow moving preceding vehicle. It is triggered if:
  • x_follow < x_stop,min
  • the system may generate an alert or warning flag, issued by the image processor, to report a close cut-in.
  • the controller responds by commanding a gentle deceleration as a first reaction.
  • the step in the measured distance to the relevant target x_tgt (such as at 36 seconds) reveals the first tracking of the merging vehicle.
  • the full deceleration is sent to the interface.
  • the lateral and longitudinal controllers' task is to output the particular commands to the vehicle interfaces. Lateral and longitudinal guidance are intended to also work separately. Thus, two parallel controllers are preferred over an integrated concept.
  • the driving situation is classified. This allows an adaptation of the characteristics of the controller to provide a comfortable and safe vehicle motion without any high acceleration or jerk in either dimension.
  • the vision system of the present invention may be operable to detect when the subject vehicle or equipped vehicle is not moving or moving slowly and is blocking a driveway or intersection and there is another vehicle that is trying to pull out into the lane occupied by the subject vehicle.
  • the system may stop the subject vehicle to allow the other vehicle to pull into the lane ahead of the subject vehicle or the system may move the subject vehicle forward to allow sufficient space behind the subject vehicle for the other vehicle to at least partially pull out into the lane behind the subject vehicle. For example, and as shown in FIG.
  • the system may apply the vehicle brakes to hold the vehicle's position to allow the other vehicle to pull into the lane of traffic ahead of the subject vehicle.
  • the system may also flash the headlights or honk the horn or provide some other signal to indicate to the driver of the other vehicle that they can proceed ahead of the subject vehicle.
  • the system may control the vehicle brakes and accelerator to creep forward to allow for the other vehicle 80 to pull into the lane of traffic behind the subject vehicle.
  • the system may provide a resting distance or gap between the subject vehicle and a leading vehicle, such as about five meters or thereabouts, so creeping forward a little would take up some of that resting distance or gap, while still spacing the subject vehicle from the vehicle ahead of it.
  • the system may allow the subject vehicle to creep forward about 2.5 meters or thereabouts (even though the leading vehicle has not moved forward or has moved forward less than that amount), which would allow sufficient room for the other vehicle to pull into or at least start to pull into the lane behind the crept forward subject vehicle (while still leaving a safe gap between the subject vehicle and the leading vehicle). It would then be up to the vehicle initially behind the subject vehicle to stay put and allow the other vehicle to pull into the lane ahead of them.
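The creep-forward behavior can be sketched as consuming part of the resting gap while preserving a minimum safe spacing to the leader. The distances follow the values above; the function itself is illustrative:

```python
def creep_allowance(gap_to_leader_m, resting_gap_m=5.0, min_safe_gap_m=2.5):
    """How far the stopped subject vehicle may creep forward to make room
    behind it, without closing below the minimum safe gap to the leader.

    Distances (5 m resting gap, 2.5 m creep/minimum gap) follow the text;
    the helper itself is an illustrative sketch.
    """
    return max(0.0, min(gap_to_leader_m - min_safe_gap_m,
                        resting_gap_m - min_safe_gap_m))
```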
  • with the system of the present invention, when the subject vehicle is in the trailing vehicle position (where a leading vehicle 82 ( FIG. 13 ) creeps forward to make room for another vehicle to pull in behind the leading vehicle), the system may determine when the leading vehicle creeps forward and may apply the brakes so as to not follow the forward movement, so that the other vehicle has sufficient room to pull into the lane ahead of the subject vehicle.
  • the concept is based on a state space representation of the controlled system.
  • the state of the system is expressed by a vector of quantities, which are significant for the dynamic behavior of the system.
  • an expansion of the linear bicycle model leads to a 5-element state vector:
  • the Situation Adaptation is represented by the Parameter Scheduling module, which in every time step delivers a tuned parameter set K.
  • This set takes into account the situation and, in particular, the current velocity.
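With parameter scheduling, the state controller reduces at each time step to u = −K(v)·x, with the gain vector looked up (and interpolated) by speed. A minimal sketch; the state layout, gain values and linear interpolation are illustrative assumptions:

```python
def lateral_control(state, speed_mps, gain_table):
    """Full state feedback u = -K(v) . x with speed-scheduled gains.

    state: state vector (e.g. lateral offset, heading error, ...).
    gain_table: list of (speed_mps, gains) breakpoints sorted by speed;
    gains are linearly interpolated between breakpoints. All values
    here are illustrative, not from the patent.
    """
    speeds = [s for s, _ in gain_table]
    if speed_mps <= speeds[0]:
        k = gain_table[0][1]
    elif speed_mps >= speeds[-1]:
        k = gain_table[-1][1]
    else:
        for (s0, k0), (s1, k1) in zip(gain_table, gain_table[1:]):
            if s0 <= speed_mps <= s1:
                t = (speed_mps - s0) / (s1 - s0)
                k = [a + t * (b - a) for a, b in zip(k0, k1)]
                break
    # steering command from negative state feedback
    return -sum(ki * xi for ki, xi in zip(k, state))
```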
  • the longitudinal controller of the traffic assist system may be implemented as a derivation of a sliding mode controller (SMC).
  • the system's dynamic state with respect to the target is described in a phase space spanned by the distance x and its derivative ẋ, which can be measured as the relative velocity.
  • the location of the current dynamic state in relation to this function determines the acceleration output.
  • the feed forward controller can take into account external disturbances such as the road slope or wind forces. Furthermore, the measured acceleration of the relevant target can be compensated.
  • the output may be switched between the minimum and maximum acceleration. This leads to uncomfortable behavior in a real system, because the actuators do not respond infinitely fast (an effect referred to as "chattering").
  • linear transitions based on the distance of the current state (i.e. point in phase space) in relation to the sliding surface may be implemented to achieve a continuous output.
  • Specific areas, where, for example, the maximum acceleration is applied, are defined by additional surfaces in the phase space as depicted in FIG. 17 .
  • the parameters defining the shape and position of the surfaces may be adapted by the Situation Adaptation module, especially to make low-speed driving more comfortable. This virtually adds a third dimension to the phase space.
  • the use of this adapted SMC allows an intuitive specification and parameterization of the longitudinal behavior of the SV.
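The adapted sliding-mode behavior can be sketched with a saturated switching law: form the sliding variable in the (distance error, relative velocity) phase plane and replace the hard switch with a linear boundary layer so the output stays continuous and chattering is avoided. The gains and boundary width are illustrative assumptions:

```python
def smc_accel(gap_error_m, rel_vel_mps,
              lam=0.5, boundary_m=2.0,
              a_min=-4.0, a_max=1.5):
    """Sliding-mode longitudinal command with a linear boundary layer.

    Sliding variable s = rel_vel + lam * gap_error. Outside the boundary
    layer the output saturates at a_min/a_max; inside it ramps linearly
    instead of switching, which avoids actuator chattering. Gains and
    the boundary width are illustrative, not from the patent.
    """
    s = rel_vel_mps + lam * gap_error_m
    if s >= boundary_m:
        return a_max
    if s <= -boundary_m:
        return a_min
    # linear transition across the boundary layer
    t = (s + boundary_m) / (2.0 * boundary_m)  # maps s to 0..1
    return a_min + t * (a_max - a_min)
```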
  • the identical controller can be used for implementation in different vehicles by adapting the subsidiary engine controller to the vehicle characteristics.
  • the system may include an adaptive cruise control (ACC) type of interface.
  • an indicator may be provided with symbols or indicia to indicate to the driver what is preventing the system from operating. For example, if the target vehicle is not detected, the system may highlight a figure of a leading car in yellow. To reduce confusion about the system state and warnings without diverting too much attention from the road, audible or voice alerts may be provided. For example, the activation of the system may be confirmed by the spoken message “Automated Drive Active”.
  • the system may request a certain driver reaction, such as, for example, pulling an ACC lever or input, which may correspond to a re-activation of the traffic driving assist function.
  • the driver may perform this action at any time within the period to reset the time counter. If the driver does not perform the required input within the time period following the request or alert, the demand for this reaction may be escalated to a low priority warning with a text, icon or audible message for the driver. If no driver reaction is detected even then, the system may request the driver to take over guidance, and otherwise will transition into a Safe Mode and slow the vehicle to a stop.
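This escalation ladder can be sketched as a timer-driven policy: a driver interaction resets the counter, and as the timer passes successive thresholds the system moves from a reaction request to a warning, a takeover request and finally the Safe Mode stop. Only the 120-second request interval comes from the text above; the later thresholds are assumptions:

```python
def attention_action(seconds_since_interaction: float) -> str:
    """Map time since the last driver interaction to a system action.

    120 s: request a driver reaction (e.g. pull the ACC lever).
    The later thresholds (warning, takeover request, Safe Mode stop)
    are illustrative values, not from the patent.
    """
    if seconds_since_interaction < 120.0:
        return "normal"
    if seconds_since_interaction < 130.0:
        return "request_reaction"
    if seconds_since_interaction < 140.0:
        return "warn_low_priority"
    if seconds_since_interaction < 150.0:
        return "request_takeover"
    return "safe_mode_stop"
```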
  • the present invention provides a traffic jam assist function using a mono-camera as the principal sensor.
  • the system can provide improved driving conditions for many commuters.
  • the system may integrate detection of and response to traffic signs, traffic lights and other road features to allow compliance with traffic rules while driving automatically. Automated lane changes may be provided on the basis of low cost sensor fusion and environmental modeling. These features, complemented by increased confidence in the system, will allow the system to optionally operate at higher speeds and to operate without another vehicle to follow.
  • the present invention may be operable to determine if a vehicle ahead of the subject vehicle changes course as it travels through a “blind” intersection, whereby the system may determine that such a change in course is indicative of a lane shift or an object ahead of the subject vehicle.
  • There are many intersections that are crested. In some cases, it is because one road used to be the through-way while the other had to stop (and now there is a traffic light), or it might be due to coming up a hill and crossing a road that rides along the ridge (such as with some streets of San Francisco).
  • One of the biggest clues is the vehicle traveling ahead of the equipped or subject vehicle. If the leading vehicle crosses the intersection and moves to one side, the system (via processing of data captured by the forward facing camera or sensor) can use this information as a clue for the self-guided vehicle (in the absence of onboard maps that may show how the road changes at the intersection).
  • the system, responsive to a determination of a shift by the leading vehicle, is operable to adjust the course of the subject vehicle as it crosses the intersection, and may then further adjust the course as the view unfolds while the vehicle continues across the intersection.
  • the camera or sensor may comprise any suitable camera or sensor.
  • the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
  • the system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras.
  • the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects.
  • the vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like.
  • the imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640 ⁇ 480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array.
  • the photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns.
  • the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,
  • the system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
  • the imaging device and control and image processor and any associated illumination source may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos.
  • the imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos.
  • a vehicle vision system such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos.
  • a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. Publication Nos. US-2006-0061008 and/or US-2006-0050018, which are all hereby incorporated herein by reference in their entireties.
  • the display is viewable through the reflective element when the display is activated to display information.
  • the display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like.
  • the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and/or 6,124,886, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.


Abstract

A vehicular control system includes a camera disposed at a windshield of a vehicle and viewing forward through the windshield. With the equipped vehicle moving in a forward direction, a control, via image processing at an image processor of image data captured by the camera, determines presence of a pedestrian ahead of the vehicle and in the field of view of the camera. The control, via image processing at the image processor of image data captured by the camera, determines if the pedestrian present ahead of the equipped vehicle is moving across a path of travel of the equipped vehicle. The control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to allow the pedestrian to move out of the path of travel of the forward moving equipped vehicle.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 15/996,727, filed Jun. 4, 2018, now U.S. Pat. No. 10,688,993, which is a continuation of U.S. patent application Ser. No. 14/568,177, filed Dec. 12, 2014, now U.S. Pat. No. 9,988,047, which claims the filing benefits of U.S. provisional applications, Ser. No. 61/953,970, filed Mar. 17, 2014, Ser. No. 61/919,133, filed Dec. 20, 2013, and Ser. No. 61/915,218, filed Dec. 12, 2013, which are hereby incorporated herein by reference in their entireties.
  • FIELD OF THE INVENTION
  • The present invention relates to imaging systems or vision systems for vehicles.
  • BACKGROUND OF THE INVENTION
  • Use of imaging sensors in vehicle imaging systems is common and known. Examples of such known systems are described in U.S. Pat. Nos. 5,949,331; 5,670,935 and/or 5,550,677, which are hereby incorporated herein by reference in their entireties.
  • SUMMARY OF THE INVENTION
  • The present invention provides a vision system or imaging system for a vehicle that utilizes one or more cameras to capture images exterior of the vehicle (such as forwardly and rearwardly of the vehicle), and provides the communication/data signals, including camera data or image data, that may be displayed at a display screen that is viewable by the driver of the vehicle, such as when the driver is backing up the vehicle, and that may be processed and, responsive to such image processing, the system may detect an object at or near the vehicle and in the path of travel of the vehicle, such as when the vehicle is backing up. The vision system may be operable to display a surround view or bird's eye view of the environment at or around or at least partially surrounding the subject or equipped vehicle.
  • According to an aspect of the present invention, a vision system for a vehicle includes one or more cameras or image sensors disposed at a vehicle and having respective fields of view exterior of the vehicle, and an image processor operable to process data transmitted by the cameras. The vision system includes a forward facing camera module (having image processing circuitry incorporated therein) and also includes a rearward facing vision camera (for capturing video image data that is displayed on a display of the vehicle for viewing by the driver of the vehicle during a reversing maneuver) and/or sideward facing cameras. The vision system may provide a variety of functions by utilizing captured image data from one or more of the cameras at the vehicle, such as a forward viewing camera, a rearward viewing camera, side viewing cameras and/or a forward viewing windshield mounted camera (having a field of view through the windshield of the vehicle). The vision system may have a front windshield camera module that may have image data processing capabilities for that camera and for one or more other cameras of the vehicle, or multiple cameras (such as a forward viewing camera at a forward portion of the vehicle, a rearward viewing camera, side viewing cameras, a forward viewing camera that views through a windshield of the vehicle, and optionally a night vision camera) may feed into a common image data processing module. The vision system of the present invention may be operable to determine (such as via image processing of captured image data and via a speed of the vehicle) when the vehicle is traveling in traffic, such as high volume traffic, a traffic jam situation or the like, and a controller or control system may control or autonomously drive the vehicle during such traffic conditions.
  • Optionally, when controlling the vehicle in a traffic driving condition, the system may determine when a lane splitting vehicle (such as a motorcycle or motor scooter or bicycle or other small vehicle) is driving between lanes of traffic (commonly referred to as lane splitting) and may control the subject vehicle accordingly. For example, when the system detects a vehicle or motorcycle approaching (such as from behind the vehicle or ahead of the vehicle) at the left side, the system may move the subject vehicle towards the right side of the subject vehicle's lane or occupied lane and away from the lane splitting motorcycle, while still remaining in the occupied lane. After the detected motorcycle passes, the system may move the subject vehicle back towards the center of the occupied lane. Also, responsive to detection of a lane splitting motorcycle or the like, the system can limit or prevent lane change if such a lane change would result in collision with the lane splitting motorcycle.
  • Optionally, when controlling the vehicle in a traffic driving condition, the system may determine when more lanes of traffic begin, such as when two lanes of traffic change to three or four lanes as the vehicles move closer together to create additional lanes to enhance traffic flow. Responsive to such a determination, the system may automatically select a leading vehicle or “lane of traffic” to follow and follow that vehicle even if that results in the subject vehicle leaving the road lane that it had been occupying.
  • Optionally, the system may be operable to detect pedestrians and may slow or stop to allow for pedestrians to cross the road in front of the vehicle as may occur in high volume traffic situations and/or fast or slow moving traffic situations, such as in a crowded city street or the like. For example, the control system, at least in part responsive to detection of a stationary pedestrian in the path of travel of the equipped vehicle, may be operable to stop the equipped vehicle. Also, for example, the control system, at least in part responsive to detection of a moving pedestrian in the path of travel of the equipped vehicle, may be operable to slowly move the equipped vehicle forward at a speed that allows the pedestrian time to move out of the path of travel of the equipped vehicle.
  • The system thus is operable to determine the driving condition or traffic condition of the subject vehicle and, when that determined condition is indicative of traffic, such as high volume traffic or slow moving traffic or a traffic jam, the system may control the subject vehicle to drive the subject vehicle in the traffic. The system detects the surrounding vehicles and determines the appropriate driving direction and speed for the subject vehicle to move the subject vehicle with the traffic flow and to adapt the driving of the subject vehicle to the surrounding vehicles and traffic flow. The system is operable to detect pedestrians and may slow or stop to allow for pedestrians to cross the road in front of the vehicle as may occur in high volume traffic situations and/or fast or slow moving traffic situations, such as in a crowded city street or the like. Also, the system may drive the vehicle forward slowly and carefully if all pedestrians in front of the vehicle are moving, such as in a manner that would result in the pedestrians being out of the path of travel of the vehicle when the vehicle is at the current location of the detected pedestrians, or such as in a manner that allows the pedestrians time to move out of the path of travel of the vehicle (allowing the pedestrian time to change their path by the time the vehicle arrives at their initial location).
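The pedestrian handling described in this summary reduces to a small decision rule: stop for a stationary pedestrian in the path, creep slowly when every pedestrian in the path is moving (giving them time to clear it), and proceed otherwise. A hedged sketch; the data layout and labels are illustrative assumptions:

```python
def pedestrian_response(pedestrians):
    """Decide the vehicle's behavior from detected pedestrians.

    pedestrians: list of dicts with 'in_path' (bool) and 'moving' (bool).
    Returns 'stop', 'creep' (slow forward motion that allows pedestrians
    time to move out of the path), or 'proceed'. The representation is
    an illustrative sketch, not the patent's implementation.
    """
    in_path = [p for p in pedestrians if p["in_path"]]
    if not in_path:
        return "proceed"
    if any(not p["moving"] for p in in_path):
        return "stop"          # a stationary pedestrian blocks the path
    return "creep"             # all pedestrians moving: advance slowly
```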
  • These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a plan view of a vehicle with a vision system and imaging sensors or cameras that provide exterior fields of view and may provide information to the driver via a display in accordance with the present invention;
  • FIG. 2 is a flow chart of the vision system control of an equipped vehicle in accordance with the present invention, showing vehicle control in a lane splitting situation;
  • FIG. 3 is a schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in a lane splitting situation;
  • FIG. 4 is a flow chart of the vision system control of an equipped vehicle in accordance with the present invention, showing vehicle control in an expanding lane situation;
  • FIG. 5 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in an expanding lane situation;
  • FIG. 6 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in a lane merging situation when the equipped vehicle accepts another vehicle's attempt to merge ahead of the equipped vehicle;
  • FIG. 7 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in a lane merging situation when the equipped vehicle rejects another vehicle's attempt to merge ahead of the equipped vehicle;
  • FIG. 8 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in pedestrian situations, where the vehicle stops when a non-moving pedestrian is in the immediate path of travel of the vehicle;
  • FIG. 9 is another schematic of how the vision system controls equipped vehicles in accordance with the present invention, showing vehicle control in pedestrian situations, where the vehicle moves slowly forward when moving pedestrians are in the path of travel of the vehicle or have a trajectory which could cross the path of the vehicle;
  • FIG. 10 is a schematic of the functional structure of the traffic assist system of the present invention;
  • FIG. 11 is an image showing an overlay of object selection and path planning data for use in system analysis;
  • FIG. 12 is an example of the operation of the traffic assist system of the present invention;
  • FIGS. 13 and 14 are schematics of how the vision system of the present invention controls the equipped vehicle when another vehicle wants to pull out in front of or behind the equipped vehicle, such as when the equipped vehicle is blocking or partially blocking a driveway or the like;
  • FIG. 15 is a schematic of the functional principle of the lateral controller of the system of the present invention;
  • FIG. 16 is a schematic of the functional principle of the longitudinal controller of the system of the present invention; and
  • FIG. 17 is a chart showing an exemplary location of the sliding surface (a=0) and defined areas in the phase space for t_h = 1 s, v_ego = 10 m/s.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A vehicle vision system and/or driver assist system and/or object detection system and/or alert system and/or control system and/or autonomous vehicle control system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes a processor that is operable to receive image data from the vehicle cameras and may provide a displayed image that is representative of the subject vehicle (such as for a top down or bird's eye or surround view, such as discussed below). The vision and display system may utilize aspects of the systems described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, and/or U.S. patent application Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503, and/or Ser. No. 12/405,558, filed Mar. 17, 2009, now U.S. Pat. No. 9,019,090, which are hereby incorporated herein by reference in their entireties. The vision system may have a front windshield camera module that may have image data processing capabilities for that camera and for one or more other cameras of the vehicle, or multiple cameras (such as a forward viewing camera at a forward portion of the vehicle, a rearward viewing camera, side viewing cameras, a forward viewing camera that views through a windshield of the vehicle, and optionally a night vision camera) may feed into a common image data processing module, such as by utilizing aspects of the vision systems described in U.S. patent application Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503, which is hereby incorporated herein by reference in its entirety.
  • Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera (such as a wide angle camera or multiple sensors on a single camera or the like), such as a rearward facing imaging sensor or camera 14 a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forwardly facing camera 14 b at the front (or at the windshield) of the vehicle, and a sidewardly/rearwardly facing camera 14 c, 14 d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (FIG. 1). The vision system 12 includes a control or processor 18 that is operable to process image data captured by the cameras and may provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle).
  • The vision system may also operate in conjunction with other sensors of the vehicle, such as RADAR sensors or LIDAR sensors or Time-of-Flight (TOF) sensors or Ultrasonic sensors or the like. The system thus may be operable to provide enhanced detection of objects or other vehicles at or near the subject or equipped vehicle and may determine the distance to the objects or other vehicles and the speed and directional heading of the detected objects or other vehicles relative to the equipped vehicle.
  • The system of the present invention is operable to provide a driver assist or traffic jam assist function (providing lateral and longitudinal control in pedestrian and/or traffic scenarios (low speed, controlled access road)). Optionally, the vision system of the present invention may provide various features, such as, for example, a full autonomous driving function including autonomous lane change to overtake slower cars, construction area driving and lane merges, an autonomous pull-over maneuver function in case of an incapacitated and/or unresponsive driver, an automatic trailer hookup function (which is operable to guide the vehicle to a trailer), an automatic cruise control (ACC) automatic-go function in stop and go ACC (such as for city driving conditions/environments), an enhanced automatic emergency braking (AEB) function based on rear traffic (optionally, for example, overriding or not braking or delaying braking if rear traffic is present), a blind spot detection function (to limit or prevent accidents during lane change maneuvers), an onramp assist function (to predict whether the equipped vehicle can accelerate enough to merge with existing traffic before the end of the onramp), a low speed CMB/pedestrian function (with a wider field of view to detect pedestrians that are relevant for impact while driving at very low speeds (such as around 1-2 m/s or thereabouts or more or less)), a prevent running red lights function (such as by generating an alert and/or optionally braking the vehicle), an alert to go when a traffic light changes to green, a better lane detection function in low sun or low lighting conditions (with improved availability of lane information such as, for example, for LKA, LDW and the like), a trailer backup function (which is operable to automatically steer the vehicle based on a driver selected trajectory), an automatic parking (parallel, perpendicular) function with driver in control of longitudinal movement, an autonomous/remote
controlled parking (parallel, perpendicular) function, a traffic sign recognition (TSR) extension to height limitation signs, a parking path height detection function, an AEB function during a reversing or backup maneuver, a traffic sign recognition (TSR) to set ACC speed (so as to provide a speed limiter function or the like), a ball detection function, a pedestrian impact detection function to activate a pedestrian protection (pedpro) system (such as to use a camera to replace an existing sensor or in addition to another impact sensor), a road friction estimation function (such as for determining if the vehicle is traveling on snow, gravel, ice or the like) to adjust the AEB thresholds and/or curve speed warning, a pothole depth and speed bump height estimation function for an active suspension control, a read license plate of preceding/following vehicle function (such as, for example, for Amber Alert notifications and the like), a curb detection/warning function if a curb is too high to drive onto (such as if the vehicle is being driven towards a curb, so as to limit or prevent damage to the wheels or rims of the vehicle), an application of 3D information to parking situations function, a perspective correction function for a more accurate bird's eye view (more realistic image), an ACC function that limits or precludes acceleration of the subject vehicle when the subject vehicle is being overtaken by another vehicle, and/or a lighting control function (such as adjusting a lighting decision based on knowledge of another car overtaking or driving parallel to the subject vehicle), and/or the like.
  • The present invention provides a vehicle vision system that is operable to provide semi-automated driving and/or hands free driving to assist the driver in maneuvering the vehicle in traffic conditions so the driver of the equipped vehicle can relax and not have to operate the vehicle during the typical starting/stopping/weaving driving conditions of a traffic situation. The system of the present invention provides autonomous or semi-autonomous vehicle control in a traffic environment (such as high volume traffic conditions or traffic jam conditions or the like), and may take control of the vehicle responsive to detection of a high or threshold level traffic environment, such as responsive to image processing of captured image data (such as when the image processing determines that the equipped vehicle is generally or at least partially surrounded by other vehicles in a manner indicative of a traffic condition) and the speed of the equipped vehicle. For example, the system may only provide such control of the vehicle when the vehicle is traveling at lower speeds, such as below about 25 kph or below about 50 kph or below about 70 kph or thereabouts.
  • The present invention provides semi-autonomous driving capability utilizing the image data captured from multiple vehicle cameras, such as five exterior viewing cameras (such as, for example, a front mounted forward viewing camera, a rear mounted rearward viewing camera, side mounted sideward/rearward viewing cameras and a windshield mounted forward viewing camera or camera module). The system is operable to track lane markings and to position the vehicle at the road based on information from the four cameras at the exterior of the vehicle, such as when the windshield camera can no longer see the lane markings due to dense traffic, for example. As the leading vehicles ahead of the equipped vehicle come in close, it may not be possible to see twenty meters of lane markings from the windshield camera or front mounted camera, but it may be possible to see the lane markings alongside or behind the equipped vehicle using the other cameras. This can be used to position the vehicle for autonomous driving.
  • When the lane markings are not determinable, such as in a city environment, it may be possible to use other information, such as adjacent vehicles or road constructions (such as curbs or the like) to delineate the path of travel of the equipped vehicle. In cases where lane splitting or lane sharing is allowed, the vehicle surround view cameras may be used to identify approaching motorcycles or bicycles that may be traveling between lanes of traffic. When such approaching small vehicles are detected, the vehicle may adjust its path of travel within its occupied lane to create space for the approaching motorcycle/bicycle.
  • The system of the present invention is thus operable to detect traffic behaviors, and may control the equipped vehicle to maneuver the vehicle with the traffic flow in a manner that provides safe travel and that controls the vehicle so that the vehicle is driven in a similar manner as the other vehicles on the road, in order to enhance traffic flow. The system may use any suitable processing means or protocol to determine the traffic conditions and to detect vehicles and/or pedestrians on or off the road being traveled by the equipped vehicle and on or off cross roads and merging roads at or near the traveled road. The system may learn or adapt the driving or control of the vehicle (such as during the driving or control of the vehicle or before taking control of the vehicle) responsive to the driving or maneuvering of other vehicles on the road. The system may adapt the driving or control of the vehicle responsive to a geographical location of the equipped vehicle to provide regional localization control, in order to adapt the autonomous control or driving to the driving characteristics of that location or region (for example, drivers drive differently in California, Paris and Italy, where it is acceptable for motorcycles and the like to drive along lane markers and between lanes of traffic).
  • Responsive to detected vehicles and objects and pedestrians in a determined traffic situation, the system of the present invention may control the vehicle to react to a determined hazardous condition or danger. For example, the system may move the vehicle to one side or the other, or may prevent a lane change by the driver of the vehicle, or may prevent a door opening by an occupant of the vehicle or the like, depending on determined objects or vehicles at or near or approaching the equipped vehicle. The system may interact with the determined other vehicles and may provide control of the vehicle motion or steering/accelerator/brakes of the equipped vehicle, and may control a turn signal of the vehicle to interact with other vehicles and drivers and systems on the road.
  • For example, and with reference to FIGS. 2 and 3, the vision system or control system of the present invention is operable, at least when in a traffic situation, to detect a lane splitting vehicle, such as a motorcycle cutting through slow moving traffic and between vehicles traveling along two adjacent lanes, and to control the equipped vehicle accordingly. The system may utilize a side camera and/or a rear camera and/or a blind spot radar sensor or the like to detect the presence or approach of a lane splitting vehicle (such as in conjunction with a lane marker detection so the system is aware of the lane delineations for the lane in which the equipped vehicle is travelling). As can be seen with reference to FIGS. 2 and 3, when a lane splitting vehicle 30 (FIG. 3) is determined to be approaching from the rear left side of the equipped vehicle 40, the control system may steer the vehicle to the right, while remaining within the lane that the equipped vehicle occupies (in other words, moving the vehicle towards the right side lane marker but without making a lane change). Such movement allows for more room for the lane splitting vehicle to pass at the left side more safely. After the lane splitting vehicle passes, the system may control the equipped vehicle to move back towards the center of the occupied lane, and then may travel with the traffic flow along the occupied lane.
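The in-lane bias maneuver described above can be sketched as a small helper function. This is an illustrative sketch only, not part of the patent disclosure; the function name, lane geometry parameters, and the 0.3 m marker margin are assumptions chosen for the example:

```python
def lane_bias_offset(splitter_side, lane_width_m, vehicle_width_m, margin_m=0.3):
    """Return a lateral offset (m, positive = shift right) that moves the
    equipped vehicle away from an approaching lane-splitting vehicle while
    keeping the vehicle within its occupied lane.

    splitter_side: 'left' or 'right' (side on which the motorcycle/bicycle
    approaches), or None when no lane-splitting vehicle is detected.
    """
    if splitter_side is None:
        return 0.0  # no splitter detected: track the lane center as usual
    # Largest in-lane shift that still keeps a safety margin to the near marker.
    max_shift = max(0.0, (lane_width_m - vehicle_width_m) / 2.0 - margin_m)
    # Shift away from the splitter; after it passes, the offset returns to 0.
    return max_shift if splitter_side == 'left' else -max_shift
```

For a 3.5 m lane and a 1.8 m wide vehicle, a splitter approaching on the left yields a 0.55 m rightward offset; once the splitter has passed, calling the function with `None` returns the vehicle toward the lane center.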
  • Optionally, the system may similarly determine when a vehicle is driving along a center lane (such as for example, where, such as in Russia, vehicles typically drive along center lanes in high traffic conditions), and may control the equipped vehicle accordingly. For example, the system may, responsive to a determination that vehicles are traveling along the center lane (or where vehicles are queuing or traveling regardless of lane delineations), control the equipped vehicle to follow those vehicles to enhance traffic flow, even if it requires moving the equipped vehicle out of the occupied marked lane of the road. Such a function may utilize image processing and/or data processing of outputs of various sensors, such as, for example, cameras of a surround view system, a windshield mounted forward facing camera, a blind spot radar or lidar sensor or the like.
  • Optionally, the system may determine when vehicle travel or vehicle “lanes” are different from the lanes marked on the road, and may control the equipped vehicle to follow one of the leading vehicles. For example, and with reference to FIGS. 4 and 5, the control system may detect an increase in traffic or a shift in traffic ahead of the equipped vehicle (such as when there are more lanes of vehicles than road lanes) and may determine which path or line of vehicles to follow. The system may select a faster moving line of vehicles or a particular side or direction (such as, for example, the right side line of vehicles when the equipped vehicle is approaching an exit or right turn along its selected or predetermined route), and may control or steer the vehicle to follow the vehicles of the selected line of vehicles. The system thus selects or chooses a target vehicle to follow, which may be a faster vehicle and/or may be partially occupying the road lane that the equipped vehicle is traveling, and such a selection may be based at least partially on the intent of a vehicle adjacent to the equipped vehicle. For example, and as shown in FIG. 5, if the equipped vehicle 40 selects a vehicle 50 to the left to follow, but a left side adjacent vehicle 60 is moving to follow that vehicle (or otherwise indicates that it intends to follow that vehicle), the system may select a different vehicle to follow or may adjust the driving to fall in behind the adjacent vehicle or the like.
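The target-selection logic above (prefer faster lines, honor a route-preferred side, and yield a target that an adjacent vehicle has already claimed) might be scored as follows. This is a hypothetical sketch, not the patented method; the candidate fields, the side bonus of 2.0, and the scoring form are all assumptions for illustration:

```python
def select_target(candidates, prefer_side=None):
    """Choose which leading vehicle (line of vehicles) to follow.

    candidates: list of dicts with keys
      'id'      - identifier of the candidate target vehicle,
      'speed'   - its speed in m/s (faster lines are preferred),
      'side'    - 'left', 'center' or 'right' relative to the equipped vehicle,
      'claimed' - True if an adjacent vehicle already intends to follow it.
    Returns the chosen id, or None if every candidate is claimed.
    """
    free = [c for c in candidates if not c['claimed']]
    if not free:
        return None  # fall in behind the adjacent vehicle instead

    def score(c):
        # Faster lines score higher; the route-preferred side gets a bonus,
        # e.g. the right side when the planned route exits to the right.
        return c['speed'] + (2.0 if c['side'] == prefer_side else 0.0)

    return max(free, key=score)['id']
```

With no preference the fastest unclaimed line wins; setting `prefer_side='right'` can flip the choice toward a slower line serving an upcoming exit.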
  • In such traffic situations, the system overrides any lane departure warning system or alert and steers the equipped vehicle outside of its occupied lane and may even continue to drive the vehicle along a lane marker and thus not in any marked lane during such a traffic condition, and may even drive the vehicle partially onto or fully onto a shoulder of the road to follow the selected line of vehicles. Optionally, the system may also utilize a navigation system and/or pavement detection or the like to make sure that the equipped vehicle stays on its intended or selected course or route when following vehicles outside of the road lanes. The system may alert the driver that the vehicle or system is entering this special driving mode before entering or commencing that mode and during the out of lane maneuvers.
  • When controlling the equipped vehicle in slow heavy traffic conditions, the system may determine that a better path involves a lane change, such as to follow a faster moving line of vehicles in an adjacent lane. Thus, the system may be operable to steer the vehicle to one side or the other to enter the adjacent lane when traffic permits. In such a situation, the system may determine when a gap between consecutive vehicles (a leading vehicle and a trailing vehicle following the leading vehicle along the same or similar path and/or in the same lane of traffic) in an adjacent lane is sufficient to begin moving over and into the adjacent lane, and the system may steer the vehicle towards and into that gap (and may actuate the turn signal accordingly, if such turn signal use is appropriate), such as in a manner that allows the following vehicle behind the gap to slow to allow the equipped vehicle to enter the gap. The system may be operable to determine when the other vehicles do not allow such a lane change maneuver (such as when the other vehicle does not slow down to allow the lane change maneuver), and may return the vehicle to its lane and try again at a later gap. The system may adjust its control or sensitivities responsive to the geographical location and/or driving behavior of the other vehicle drivers, and may learn or adapt responsive to the current driving conditions and driving behaviors.
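One toy form of the gap-acceptance test described above is sketched below. It is not taken from the patent; the minimum gap, the one-second headway, and the simplification that the equipped vehicle enters at the gap midpoint are all assumptions made for the example:

```python
def gap_acceptable(gap_length_m, trailing_speed_mps,
                   min_gap_m=6.0, time_headway_s=1.0):
    """Decide whether a gap between consecutive vehicles in the adjacent
    lane is sufficient to begin a lane change.

    The gap must exceed a minimum length, and entering at the gap midpoint
    must leave the trailing vehicle at least `time_headway_s` of headway at
    its current speed (so it can comfortably slow to let the ego vehicle in).
    """
    if gap_length_m < min_gap_m:
        return False
    # Headway the trailing driver would have once we occupy the gap midpoint.
    headway_s = (gap_length_m / 2.0) / max(trailing_speed_mps, 0.1)
    return headway_s >= time_headway_s
```

If the trailing vehicle then fails to yield, the text above applies: the system returns the vehicle to its lane and re-evaluates a later gap with the same test.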
  • Likewise, when driving in slow heavy traffic conditions, the system may determine when another vehicle driver wants to cut into the lane of the equipped vehicle ahead of the equipped vehicle. Responsive to such a determination, the system may (such as shown in FIG. 6) slow or stop the equipped vehicle 40 to allow for the other vehicle 60 to cut in, or may (such as shown in FIG. 7) move the equipped vehicle 40 to the side to reject the cut in but partially allow the other vehicle 60 to continue traveling (where eventually the other vehicle should slow and fall in behind the equipped vehicle), depending on the particular driving conditions and traffic flow and traffic situation. The system may adapt or calibrate its sensitivity or processing so that, after one vehicle cuts in, the system is less tolerant of other vehicles cutting in as well, to avoid a potential situation where the system stops the vehicle and allows a steady stream of other vehicles to cut in ahead of the equipped vehicle.
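The adapting cut-in tolerance described above (accept one cut-in, then raise the bar so a steady stream of vehicles cannot keep merging ahead) could be modeled as a small stateful policy. This sketch is illustrative only; the "yield score" abstraction, the thresholds, and the step/decay values are assumptions, not disclosed parameters:

```python
class CutInPolicy:
    """After accepting one cut-in, become less tolerant of further cut-ins;
    tolerance slowly recovers after each rejection."""

    def __init__(self, base_threshold=0.5, step=0.25, decay=0.05):
        self.base = base_threshold  # score needed to accept a cut-in
        self.step = step            # added to the bar after each acceptance
        self.decay = decay          # tolerance recovered after a rejection
        self.penalty = 0.0          # accumulated intolerance

    def accept(self, yield_score):
        """yield_score in [0, 1]: how strongly the other vehicle is committed
        to merging (e.g. lateral encroachment, turn signal, door open)."""
        ok = yield_score >= self.base + self.penalty
        if ok:
            self.penalty += self.step               # raise the bar next time
        else:
            self.penalty = max(0.0, self.penalty - self.decay)
        return ok
```

A first moderately assertive vehicle is let in; an identical second attempt right afterward is rejected, and only a much more committed merge succeeds until the penalty decays.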
  • The system of the present invention may also be operable to determine the “body language” of other drivers or vehicles to determine the intent of the driver of the other vehicle. For example, in some areas, such as in China, some drivers open the door of the vehicle to signal and/or force merging into an adjacent lane or line of traffic. The system of the present invention is operable to determine such actions and control the vehicle accordingly (such as to slow the equipped vehicle to allow for the cut in when it is determined that the leading or merging vehicle has its door open). Such a determination may be made via image processing of captured image data by one or more forward facing cameras of the equipped vehicle or by processing of outputs of ultrasonic sensors or the like of the equipped vehicles.
  • In all of the above high traffic or traffic jam situations, the system of the present invention may be operable to determine (such as via image processing of image data captured by side and/or forward facing cameras or night vision cameras, and/or outputs of radar sensors or ultrasonic sensors or lidar sensors of the equipped vehicle) the presence of one or more pedestrians at or near the equipped vehicle and ahead of the equipped vehicle. Responsive to a determination of at least one pedestrian ahead of the vehicle, the system may adjust control or driving of the equipped vehicle in order to ensure avoidance of any contact with the pedestrian or pedestrians by the autonomously driven or semi-autonomous equipped vehicle. For example, and with reference to FIGS. 8 and 9, responsive to a determination of the presence of a pedestrian or pedestrians 70 ahead of the equipped vehicle 40, the system may stop the vehicle or maneuver the vehicle so as to exclude any and all paths that are occupied or partially occupied by one or more stationary pedestrians. Optionally, when a stationary or non-moving pedestrian 72 (such as shown in FIG. 8), is determined to be present in the path of travel of the vehicle 40, the system may stop the vehicle to avoid collision with the non-moving pedestrian. When moving pedestrians 70 are detected (such as pedestrians crossing the road through the traffic), such as shown in FIG. 9, the system may determine a predicted path of the pedestrian or pedestrians and may maneuver the vehicle or slow the vehicle to make sure that the equipped vehicle avoids any conflict or potential conflict or collision with the crossing pedestrian.
  • Optionally, the system may drive the vehicle forward slowly, even when one or more pedestrians are determined to be in the path of travel of the vehicle (or determined to have a trajectory that will lead them into the path of travel of the vehicle if their trajectory persists), whereby the vehicle will continue to travel forward if the pedestrians move out of the way (the system can determine, such as responsive to detection of movement of the pedestrians ahead of the vehicle, that a collision with a pedestrian would not be immediate or imminent and may expect the pedestrians to walk out of the path of travel). The system thus may drive the vehicle slowly forward at a slow constant or substantially constant speed, so that the pedestrians can readily perceive the autonomous vehicle's intent and readily move out of the way of the slowly moving vehicle or change their trajectory to avoid the path of the slowly moving vehicle.
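The stop-versus-creep decision of FIGS. 8 and 9 reduces to a simple rule: stop for any stationary pedestrian in the path of travel, creep forward at a slow constant speed when all in-path pedestrians are moving. The sketch below is illustrative and not from the patent; the 1.0 m/s creep speed and the boolean pedestrian attributes are assumptions:

```python
CREEP_SPEED_MPS = 1.0  # slow constant speed pedestrians can readily perceive


def pedestrian_speed_command(pedestrians, ego_speed_mps):
    """Return the commanded vehicle speed given detected pedestrians.

    pedestrians: list of dicts with
      'in_path' - True if in (or projected into) the vehicle's path of travel,
      'moving'  - True if the pedestrian is moving (expected to clear the path).
    """
    in_path = [p for p in pedestrians if p['in_path']]
    if not in_path:
        return ego_speed_mps            # no conflict: keep current speed
    if any(not p['moving'] for p in in_path):
        return 0.0                      # stationary pedestrian ahead: stop
    # All in-path pedestrians are moving: creep forward so they have time
    # to clear the path before the vehicle reaches their initial location.
    return min(ego_speed_mps, CREEP_SPEED_MPS)
```

A full implementation would also check the predicted pedestrian trajectories against the planned path, as the text above describes.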
  • The present invention thus provides a system that is operable to determine the driving condition or traffic condition of the subject or equipped vehicle and, when that determined condition is indicative of a traffic jam or high traffic volume or slow moving traffic condition or faster moving traffic condition, the system may control the equipped vehicle (such as by controlling the brake system, the accelerator and steering system of the vehicle) to maneuver or drive the subject vehicle in the traffic. The system detects the surrounding vehicles and determines the appropriate driving direction and speed for the equipped vehicle to move the equipped vehicle with the traffic flow and to adapt the driving of the equipped vehicle to the surrounding vehicles and traffic flow. The system is operable to detect pedestrians and may slow or stop to allow for pedestrians to cross the road in front of the equipped vehicle as may occur in traffic situations, such as in a crowded city street or the like.
  • Optionally, the vision system may utilize rear image processing for lane detection. For example, the system may apply lane detection and tracking aspects from front image processing to rear images captured by one or more rearward facing cameras of the vehicle. The system may detect the lane markings and may determine the lateral distance to a left or right lane marking, and may control steering and/or provide an alert to the driver responsive to the detected distance to the lane markings. The system may utilize the rearward lane marking detection to provide enhanced detection of a lane splitting vehicle or motorcycle or scooter. The system thus provides increased availability of lane information to the driver, and may warn the driver even where lane departure prevention (LDP) from the front camera may not be available, such as in low lighting conditions or situations, traffic jams (when preceding vehicles block lane markings), tunnel entry and/or the like. Optionally, it is envisioned that the rear lane detection of the present invention may be used for autonomous driving/lane keeping where high lane data availability is important.
  • Optionally, the vision system of the present invention may be operable to provide other various functions. For example, the vision system may operate with or be associated with an adaptive automatic emergency braking (AEB) system of the vehicle, such that, when the subject vehicle determines that braking is desired or appropriate, the subject vehicle may, for example, brake earlier and/or harder, if no vehicle is following (as determined by image processing of the image data captured by the rearward facing camera), and risk of rear end collision is low, or may brake later, if the vision system determines that a vehicle is following, and the risk of rear end collision is higher. Optionally, the vision system may provide improved or enhanced lane detection at low sun/and increased availability of LDW based on detecting lanes in the rear camera images. Optionally, the vision system may provide rear pedestrian detection, and may provide a warning or may brake if a pedestrian is detected in the rear images, such as during a reversing maneuver of the vehicle. Optionally, the vision system may provide a rear object detection, and may provide a warning or the like if a general object is detected in the rear images. Optionally, the vision system may provide a rear cross traffic alert function, and may detect or determine crossing vehicles and may be operable to alert or warn the driver of the subject vehicle of crossing vehicles when the driver is reversing or backing up the subject vehicle. The system of the present invention may utilize aspects of the systems described in U.S. patent application Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503, which is hereby incorporated herein by reference in its entirety.
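The adaptive AEB behavior above (brake earlier when no vehicle is following, later when one is) can be expressed as a shift of the time-to-collision threshold at which braking engages. This is a toy illustration, not the patented implementation; the baseline TTC, the 0.3 s margins, and the 0.5 s floor are assumed values:

```python
def aeb_engage_ttc(base_ttc_s, follower_detected,
                   early_margin_s=0.3, late_margin_s=0.3):
    """Return the time-to-collision threshold (s) at which AEB should engage.

    With no follower (low rear-end risk), brake earlier: raise the threshold.
    With a follower detected (higher rear-end risk), brake later: lower it,
    but never below a floor that keeps AEB effective.
    """
    if follower_detected:
        return max(0.5, base_ttc_s - late_margin_s)
    return base_ttc_s + early_margin_s
```

The follower-present branch trades a slightly later, likely harder brake against the risk of being struck from behind, which mirrors the earlier/harder versus later trade-off described in the text.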
  • The present invention may also or otherwise provide enhanced control or semi-autonomous driving features to assist the driver of the vehicle during high traffic conditions or situations, such as during a commute to or from work in rush hour type traffic. Commuter mobility has been constantly increasing over the past decades. The number of accidents shows the same tendency and is not decreasing significantly even though new active and passive safety features have been introduced in modern vehicles. Automated driving has the potential to improve the safety and productive time of commuters. However, many automated driving functions in development rely on expensive sensors. For automated driving to become affordable to the public, its cost must be reduced.

  • The present invention provides a limited semi-autonomous driving system for traffic jam situations that may operate with only a single mono-camera sensor (although clearly the semi-autonomous driving assist system of the present invention may operate using multiple cameras of the vehicle). The assist system is focused on relieving the driver of the mundane task of driving in heavy traffic. It is capable of operating in a low speed range and it does not require any additional HMI or actuation beyond what is already available in vehicles with Lane Keep Assist or Cruise Control or the like.
  • The present invention utilizes an algorithm that achieves a partially automated driving function while providing a cost-effective hardware setup. The environment sensing may be performed by a monocular camera only and the software may work on a single ECU.
  • The traffic driving assist system of the present invention may decrease the driver's workload in monotonous situations in congested traffic. The driver or user can activate the driver assist function in a slow traffic situation to relieve monotony. Because the system of the present invention may be intended to be a comfort function for congested traffic on well-structured roads, optional preconditions for activation of the system may include the availability of lane markings and existence of a preceding vehicle within a certain distance. The user may activate the system during a traffic condition by activating a user input, such as by pressing a button or the like. After this, the driver no longer needs to provide steering, accelerator or brake pedal input while in the traffic condition. The driver assist function will take over the lateral guidance of the equipped vehicle and keep it generally centered in the lane. Also, the longitudinal behavior is controlled by the system to follow the closest target vehicle in the same lane (Same Lane Target), maintaining an appropriate amount of distance or gap between the target vehicle and the equipped vehicle. If the target vehicle stops then the equipped vehicle will come to a stop close behind the target vehicle and will resume driving if the target vehicle then continues, all without any further input needed from the driver of the equipped vehicle.
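The longitudinal follow behavior described above (track the Same Lane Target, keep an appropriate gap, come to a stop behind a stopped target and resume when it moves) admits a simple distance-keeping sketch. This is a hypothetical illustration, not the disclosed controller; the 1.5 s headway, 3 m standstill gap, and gain are assumed values:

```python
def follow_speed(target_speed_mps, gap_m, ego_speed_mps,
                 time_headway_s=1.5, standstill_gap_m=3.0, k_gap=0.5):
    """Commanded speed for following the closest same-lane target vehicle.

    The desired gap grows with ego speed (constant time headway plus a
    standstill gap); the command tracks the target's speed and corrects
    proportionally toward the desired gap.
    """
    desired_gap = standstill_gap_m + time_headway_s * ego_speed_mps
    cmd = target_speed_mps + k_gap * (gap_m - desired_gap)
    # Never command a negative speed: behind a stopped target the vehicle
    # comes to a stop and resumes automatically when the target moves off.
    return max(0.0, cmd)
```

When the target stops and the gap settles at the standstill value, the command is zero; once the target accelerates, the command rises without further driver input, matching the stop-and-resume behavior above.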
  • The system user may still be responsible for the vehicle and its motion. Thus, while the vehicle driving is automated, the driver may monitor the system and can intervene if necessary. Also, the driver may override the system (such as by taking control of the steering or acceleration) at any time.
  • If the automated operation of the system falls outside of its operational bounds or is becoming unsafe (such as, for example, if the target vehicle is no longer tracked ahead of the equipped vehicle), the system may be operable to warn the driver and request driver takeover of the vehicle. The system may continue to control the vehicle in the safest manner possible with limited information until the driver takes over control of the vehicle. This provides time for a possibly inattentive driver to react to the system request or alert. If the driver does not act within a certain amount of time following the warning or alert, the system may disengage or enter a safe state and gently slow the vehicle to a stop.
  • The system requirements and operational parameters may be derived from an analysis of traffic jam data collected with human drivers. For example, the velocity range in which the system may operate may be defined to allow automated driving in most traffic jam scenarios, such as at speeds below about 70 km/hr (data has shown that 99 percent of the traffic jam driving speed range is below 70 km/hr). Thus, in order to cover these situations and also to accommodate urban speed limits in the U.S., the maximum speed at which the system may control the vehicle may be set to about 72 km/hr (or about 45 mph).
  • Traffic data shows that acceleration values occurring in congested traffic are mostly below about 1.5 m/s², so the limit of the system may be set to this value, especially to allow quick starting after standstill without opening a large gap to the preceding vehicle. For deceleration in regular driving situations, a threshold of about −1.3 m/s² is reasonable, since most situations can be handled within this range. The global minimum of the allowed deceleration was set to −4 m/s² to also enable the system to react to more critical situations, such as, for example, close cut-in maneuvers or the like. More critical scenarios can be handled by designated safety functions with higher deceleration (such as an Automatic Emergency Braking (AEB) system, which may be based on the same camera system, and which may be implemented as a backup system to take control if the driving situation encountered is rated too critical for the traffic assist system).
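  • As a minimal sketch (with names of my own choosing, not from the patent), the operating envelope described above can be captured as constants and a clamping step applied to any commanded acceleration:

```python
# Illustrative limits taken from the text; variable names are assumptions.
V_MAX = 72.0 / 3.6          # maximum controlled speed, m/s (~72 km/h, ~45 mph)
A_MAX = 1.5                 # maximum acceleration, m/s^2
A_MIN_COMFORT = -1.3        # comfort deceleration threshold, m/s^2
A_MIN_CRITICAL = -4.0       # global deceleration floor for critical cases, m/s^2

def clamp_accel(a_cmd: float, critical: bool = False) -> float:
    """Clamp a commanded acceleration to the system's operating envelope.

    In regular driving the comfort threshold applies; in rated-critical
    situations (e.g. a close cut-in) the global floor of -4 m/s^2 applies.
    """
    a_min = A_MIN_CRITICAL if critical else A_MIN_COMFORT
    return max(a_min, min(A_MAX, a_cmd))
```

Scenarios beyond the −4 m/s² floor would be handed off to a separate safety function such as AEB, as the text notes.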
  • Because the system is at first intended for motorway use, the radius of the driven trajectory can be assumed to contain no sharp bends. Therefore, a minimum radius of curvature of the road trajectory may be defined to be, for example, about 180 m. The maximum duration of system operation without any driver interaction may be restricted in order to ensure that the driver's attention will not drift and that he or she is still alert. This time period may be set to, for example, about 120 seconds before a driver reaction is actually requested by the system.
  • To ensure the compatibility of the system to a variety of different vehicles, the global structure of the architecture allows for modularity. The same code can therefore be used in different setups or vehicles.
  • The functional structure of the system of the present invention is shown in FIG. 10. The Path Planning module calculates the desired behavior based on the sensor data. First, the Target Object Selection module extracts and flags the traffic objects which are relevant for the decision. Afterwards, the actual longitudinal and lateral pathing is defined. Based on the condition of the vehicle and the targeted path data, the global system status is determined within a state machine. This status determines if the system is enabled and the function is ready to be activated. Because the lateral and longitudinal guidance can be active at the same time or independently, there are states for all three possibilities. Additionally, a Safe Mode state may be implemented for the case that a safe operation cannot be guaranteed, such as, for example, when there is an extended period of driver unresponsiveness.
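  • A minimal sketch of the state machine's active states, assuming the state names below (they are not taken from the patent), could look as follows, with the three guidance combinations plus Safe Mode represented explicitly:

```python
from enum import Enum, auto

class SystemState(Enum):
    """Assumed names for the global system status of FIG. 10."""
    OFF = auto()
    READY = auto()                     # enabled, waiting for activation
    LATERAL_ONLY = auto()
    LONGITUDINAL_ONLY = auto()
    LATERAL_AND_LONGITUDINAL = auto()
    SAFE_MODE = auto()                 # safe operation cannot be guaranteed

def guidance_state(lateral_active: bool, longitudinal_active: bool) -> SystemState:
    """Map the two independently activatable guidance functions to one
    of the three active states (or READY when neither is active)."""
    if lateral_active and longitudinal_active:
        return SystemState.LATERAL_AND_LONGITUDINAL
    if lateral_active:
        return SystemState.LATERAL_ONLY
    if longitudinal_active:
        return SystemState.LONGITUDINAL_ONLY
    return SystemState.READY
```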
  • To adapt the controller characteristics to the particular driving situation, such as, for example, slow constant following, the situation is classified based on environmental, ego-vehicle or subject vehicle or equipped vehicle and path planning data. Class parameters are adapted according to the detected situation. Because the vehicle dynamics in both lateral and longitudinal dimensions are highly dependent on the actual velocity of the vehicle, the control parameters are also continuously adjusted in relation to the subject vehicle's (SV's) current speed. The two controllers for longitudinal and lateral guidance generate the actual control signals that are sent to the vehicle interface. The lateral controller is implemented as a state controller and the longitudinal guidance is implemented as a sliding-mode controller.
  • The task of the Path Planning module is to take into account the environmental information provided by sensor perception and generate the trajectory. The preceding vehicle in the same lane (the Same Lane Target or SALT) usually has the most significant influence on the longitudinal trajectory. If calibrated to prevent overtaking on the right, which is illegal in many states, the closest target in the left adjoining lane (LALT) is also relevant if it is closer and moving more slowly than the SALT. The selection of all these relevant targets and calculation of their trajectories is the task of the Target Object Selection. An example of tagged objects is shown in FIG. 11. Also, some of the calculated output of the Path Planning module can be seen at the lower region of FIG. 11.
  • To determine the particular lane in which the detected vehicle is located, the lane position is extrapolated based on the approximation:

  • y_tgt = y_lane + r_lane · x_tgt + ½ · K_lane · x_tgt²  Eq. 1
  • where y_tgt is the lateral position of the lane marking at the longitudinal position x_tgt of the target, r_lane is the relative heading angle between the SV and the lane, and K_lane is the curvature of the lane. The input values describing the lane and objects are provided by the camera sensor and then recalculated by Path Planning.
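  • Eq. 1 is a direct polynomial evaluation; a small sketch (function and parameter names are my own) makes the lane-assignment computation concrete:

```python
def lane_lateral_position(x_tgt: float, y_lane: float,
                          r_lane: float, k_lane: float) -> float:
    """Extrapolate the lane marking's lateral position at the target's
    longitudinal position x_tgt (Eq. 1):

        y_tgt = y_lane + r_lane * x_tgt + 0.5 * K_lane * x_tgt**2

    y_lane: lateral offset of the marking at the SV, r_lane: relative
    heading angle between SV and lane, k_lane: lane curvature.
    """
    return y_lane + r_lane * x_tgt + 0.5 * k_lane * x_tgt ** 2
```

Comparing a target's measured lateral position against y_tgt for the left and right markings would then indicate which lane the target occupies.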
  • In a situation where the left lane target vehicle (LALT) or right lane target vehicle (RALT) is determined to be uncomfortably close to the lane markings, then the path may be adapted to increase lateral separation. If only a single marking can be successfully tracked then the detected lane's information is mirrored for a certain time period under the assumption that the width of the lane is not changing quickly. After this duration, the virtual lane width is slowly reduced to a minimum to allow lateral guidance as long as possible. In the event that no lane markings are available or the distance to the SALT is small at a low velocity, the lateral path is based on the track of the SALT.
  • The fundamental longitudinal trajectory calculates the necessary target distance x_follow and the relative velocity to the SALT needed to maintain the headway time t_h, according to:

  • x_follow = t_h · v_tgt  Eq. 2
  • where v_tgt is the target vehicle's longitudinal velocity.
  • The analysis of traffic jam data has revealed that human drivers tend to maintain a headway time of up to 2 seconds. While it varies by state and country, the law may specify a following time of 2 seconds as well. Thus, the headway time is set to t_h = 2 s. An additional benefit of this setting has been observed while driving in real traffic. The generous gap between the SV and the preceding vehicle means that cut-in maneuvers by other road users tend not to be uncomfortably close. The function can more easily handle this critical situation with the extra distance to the cut-in vehicle.
  • The system is designed to stop the SV behind a stationary vehicle at the stopping distance x_follow ≥ x_stop,min = 4 m. For comfort and safety, this value includes some added safety distance, compared to human drivers who stop at a distance of about 2.5 m on average.
  • A Slow Mode condition has been implemented to treat the low-speed approach to a slow-moving preceding vehicle. It is triggered if:
  • v_tgt ≤ x_stop,max / t_h  Eq. 3
  • In this case x_follow is set to x_stop,max = 6 m to allow a slow approach and adaptation of the controller characteristic. As soon as the lead vehicle has stopped, Stop Mode becomes active (x_follow = x_stop,min) and a smooth stop of the SV can be initiated by the controller.
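  • The gap selection across the headway rule (Eq. 2), the Slow Mode trigger (Eq. 3) and Stop Mode can be sketched as a single selection function (a simplification, with assumed names, that ignores the lane-visibility floor discussed next):

```python
T_H = 2.0          # headway time, s
X_STOP_MIN = 4.0   # Stop Mode distance behind a stationary lead vehicle, m
X_STOP_MAX = 6.0   # Slow Mode approach distance, m

def target_follow_distance(v_tgt: float, lead_stopped: bool) -> float:
    """Select the targeted gap x_follow to the Same Lane Target.

    Stop Mode:  lead vehicle stopped       -> x_stop_min.
    Slow Mode:  v_tgt <= x_stop_max / t_h  -> x_stop_max   (Eq. 3).
    Otherwise:  headway rule x_follow = t_h * v_tgt        (Eq. 2).
    """
    if lead_stopped:
        return X_STOP_MIN
    if v_tgt <= X_STOP_MAX / T_H:
        return X_STOP_MAX
    return T_H * v_tgt
```

With these values the Slow Mode trigger of Eq. 3 corresponds to a lead-vehicle speed of 3 m/s or less.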
  • Because the availability and discernibility of lane markings are important for robust and accurate lateral guidance, the longitudinal Path Planning accounts for lane visibility. Hence, if the system is not in Slow Mode, the minimum targeted distance to the SALT is set to x_stop = 10 m to guarantee visibility even for obscured or intermittent markings on motorways.
  • As a safe response to vehicles which are cutting in or driving hazardously, vehicles that are detected to be partially in or near the SV's lane are immediately treated as the new SALT object. An example of this approach is visualized in FIG. 12.
  • Several methods are used to detect and respond to a close cut-in situation. Before effective tracking of this vehicle is possible, the system may generate an alert or warning flag, issued by the image processor, to report a close cut-in. When this alert is provided, the controller responds by commanding a gentle deceleration as a first reaction. Once the new SALT is identified, the step in the measured distance to the relevant target x_tgt (such as at 36 seconds) reveals the first tracking of the merging vehicle. As soon as the tracking allows an appropriate adjustment of the gap, the full deceleration is sent to the interface. In spite of the critical cut-in at about 4.5 m in front of the SV, uncomfortable jerks in the actual acceleration a_act can be avoided while the situation is resolved by the function.
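  • The staged response above can be sketched as follows; the gentle-reaction value of −1.0 m/s² is an assumption for illustration (the text does not give it), while the −4 m/s² floor is the global limit stated earlier:

```python
A_GENTLE = -1.0        # assumed gentle first-reaction deceleration, m/s^2
A_MIN_CRITICAL = -4.0  # global deceleration floor from the system limits

def cut_in_response(cut_in_alert: bool, target_tracked: bool,
                    a_cmd: float) -> float:
    """Stage the reaction to a close cut-in: decelerate gently on the
    image processor's alert, and only pass through the full commanded
    deceleration once the merging vehicle is tracked as the new SALT."""
    if cut_in_alert and not target_tracked:
        return A_GENTLE                      # first reaction before tracking
    if target_tracked:
        return max(a_cmd, A_MIN_CRITICAL)    # full command, floored
    return a_cmd
```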
  • One principal situation which the system has to deal with is the handling of stop-and-go traffic. When a preceding vehicle comes to a halt, the SV slowly approaches the stopped vehicle with a low, nearly constant velocity. Upon reaching about x_tgt = 6 m, deceleration is commanded to smoothly stop the vehicle. When the preceding vehicle starts again, a quick response is crucial. After the leading or target vehicle starts to move and reaches the distance threshold (x_tgt = 4.5 m), the SV starts within 0.5 s. This ensures immediate following without opening a substantial distance gap greater than the desired headway.
  • After the target trajectory is available, the lateral and longitudinal controllers' task is to output the particular commands to the vehicle interfaces. Lateral and longitudinal guidance are also intended to work separately. Thus, two parallel controllers are preferred over an integrated concept.
  • Based on the desired trajectory and the SV state, the driving situation is classified. This allows an adaptation of the characteristics of the controller to provide a comfortable and safe vehicle motion without any high acceleration or jerk in either dimension.
  • Optionally, the vision system of the present invention may be operable to detect when the subject vehicle or equipped vehicle is not moving or moving slowly and is blocking a driveway or intersection and there is another vehicle that is trying to pull out into the lane occupied by the subject vehicle. Depending on the location of the other vehicle relative to the equipped vehicle, the system may stop the subject vehicle to allow the other vehicle to pull into the lane ahead of the subject vehicle or the system may move the subject vehicle forward to allow sufficient space behind the subject vehicle for the other vehicle to at least partially pull out into the lane behind the subject vehicle. For example, and as shown in FIG. 13, if the other vehicle 80 is near the front of the subject vehicle 40, such as forward of the front axle of the subject vehicle, then the system may apply the vehicle brakes to hold the vehicle's position to allow the other vehicle to pull into the lane of traffic ahead of the subject vehicle. Optionally, in such a situation, the system may also flash the headlights or honk the horn or provide some other signal to indicate to the driver of the other vehicle that they can proceed ahead of the subject vehicle.
  • Also, for example, and as shown in FIG. 14, if the other vehicle 80 is near the rear of the subject vehicle 40, the system may control the vehicle brakes and accelerator to creep forward to allow for the other vehicle 80 to pull into the lane of traffic behind the subject vehicle. In such slow moving traffic conditions, the system may maintain a resting distance or gap between the subject vehicle and a leading vehicle, such as about five meters or thereabouts, so creeping forward a little would take up some of the resting distance or gap, while still spacing the subject vehicle from the vehicle ahead of the subject vehicle. For example, if the system leaves about five meters between the subject vehicle and the leading vehicle in slow moving traffic conditions, the system may allow the subject vehicle to creep forward about 2.5 meters or thereabouts (even though the leading vehicle has not moved forward or has moved forward less than that amount), which would allow sufficient room for the other vehicle to pull into or at least start to pull into the lane behind the crept-forward subject vehicle (while still leaving a safe gap between the subject vehicle and the leading vehicle). It would then be up to the vehicle initially behind the subject vehicle to stay put and allow the other vehicle to pull into the lane ahead of them.
  • Optionally, when the subject vehicle is in the trailing vehicle position (where a leading vehicle 82 (FIG. 13) creeps forward to make room for another vehicle to pull in behind the leading vehicle), the system of the present invention may determine when the leading vehicle creeps forward and may apply the brakes so as to not follow the forward movement, so that the other vehicle has sufficient room to pull into the lane ahead of the subject vehicle.
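  • The driveway-yielding decision of FIGS. 13 and 14 can be sketched as a simple rule; the 2.5 m minimum remaining gap follows from the five-meter resting gap and 2.5 m creep discussed above, and all names are assumptions of this sketch:

```python
def yield_action(other_ahead_of_front_axle: bool, resting_gap: float,
                 min_gap: float = 2.5) -> tuple[str, float]:
    """Decide how to make room for a vehicle pulling out of a driveway.

    If the other vehicle is near the front of the SV (forward of the
    front axle), hold position so it can pull in ahead. Otherwise creep
    forward, consuming part of the resting gap to the leading vehicle
    but never closing below min_gap.
    Returns (action, creep distance in meters).
    """
    if other_ahead_of_front_axle:
        return ("hold", 0.0)
    creep = max(0.0, resting_gap - min_gap)
    return ("creep_forward", creep)
```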
  • Lateral Control:
  • For lateral guidance, good results can be achieved with a state space controller. The system of the present invention controls lateral movement in this way, and its functional structure can be seen in FIG. 15.
  • The concept is based on a state space representation of the controlled system. Herein, the state of the system is expressed by a vector of quantities, which are significant for the dynamic behavior of the system. In this case an expansion of the linear bicycle model leads to a 5-element state vector:

  • x = [ψ̇, β, δ, r, d_y]ᵀ  Eq. 4
  • consisting of the yaw rate ψ̇, the slip angle β, the front wheel steering angle δ, the heading angle r between the SV and the trajectory, and the lateral deviation d_y from the trajectory. The basic idea is to control each quantity separately with a proportional controller and thereby apply the desired stable dynamic behavior to the closed loop.
  • The avoidance of stationary errors in the lateral deviation from the target trajectory is achieved by an additional parallel integral part for this quantity.
  • Following the lateral trajectory requires a certain reference value for every state. The reference values are calculated as a vector x_ref in the Reference Input module. The current state x_act, based on measurements and estimations, is gathered in the State Observer, closing the feedback loop.
  • The curvature of the trajectory can be modeled as an external disturbance. This can be reduced or eliminated by a feed-forward controller based on geometrical calculations, which adds a steering angle δ_z to compensate. Other measurable disturbances, such as, for example, the road bank angle, may also be eliminated in this manner.
  • The Situation Adaptation is represented by the Parameter Scheduling module, which in every time step delivers a tuned parameter set K. This set takes into account the situation and, in particular, the current velocity.
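  • One time step of such a state controller, combining proportional feedback on each state of Eq. 4, the parallel integral part on d_y, and the feed-forward angle δ_z, might look as follows (a sketch under the stated structure; the signatures and gain handling are assumptions, not the patented implementation):

```python
def lateral_control(x_act, x_ref, k, ki, dy_integral, delta_z, dt):
    """One step of the lateral state controller.

    x_act, x_ref : 5-element state and reference vectors per Eq. 4,
                   with the lateral deviation d_y as the last element.
    k            : proportional gain per state (parameter set K).
    ki           : integral gain on d_y (avoids stationary errors).
    delta_z      : feed-forward steering angle compensating curvature.
    Returns (steering command, updated d_y integral).
    """
    error = [a - r for a, r in zip(x_act, x_ref)]
    dy_integral = dy_integral + error[-1] * dt          # integrate d_y error
    delta_cmd = -sum(g * e for g, e in zip(k, error)) \
                - ki * dy_integral + delta_z
    return delta_cmd, dy_integral
```

The gains k and ki would be delivered each time step by the Parameter Scheduling module as a function of the situation and velocity.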
  • Longitudinal Control:
  • The longitudinal controller of the traffic assist system may be implemented as a derivation of a sliding mode controller (SMC). The system's dynamic state with respect to the target is described in a phase space spanned by the distance x and its derivative ẋ, which can be measured as the relative velocity. Within this space, the desired dynamic behavior while reaching a target point (in this case x = x_follow, ẋ = 0) is described by a 2-dimensional function (referred to as the sliding surface). The location of the current dynamic state in relation to this function determines the acceleration output. These commands are fed into the engine controller (see FIG. 16).
  • The feed-forward controller can take into account external disturbances such as the road slope or wind forces. Furthermore, the measured acceleration of the relevant target can be compensated.
  • In a classic SMC, the output may be switched between the minimum and maximum acceleration. This leads to uncomfortable behavior in a real system, because the actuators do not work infinitely fast (a phenomenon referred to as “chattering”). Thus, linear transitions based on the distance of the current state (i.e., the point in phase space) from the sliding surface may be implemented to achieve a continuous output. Specific areas where, for example, the maximum acceleration is applied are defined by additional surfaces in the phase space as depicted in FIG. 17.
  • The parameters defining the shape and position of the surfaces may be adapted by the Situation Adaptation module, especially to make low-speed driving more comfortable. This virtually adds a third dimension to the phase space.
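  • The chattering-free behavior can be illustrated with a simple linear sliding surface; the surface shape (s = (x − x_follow) + λ·ẋ), the boundary-layer width φ, and λ are assumptions of this sketch, not values from the patent, and the acceleration limits are the system limits stated earlier:

```python
def smc_accel(x: float, x_dot: float, x_follow: float,
              lam: float = 0.5, phi: float = 5.0,
              a_max: float = 1.5, a_min: float = -1.3) -> float:
    """Adapted sliding-mode longitudinal controller.

    The sliding variable s = (x - x_follow) + lam * x_dot is zero on the
    sliding surface (and at the target point x = x_follow, x_dot = 0).
    Instead of hard-switching between a_min and a_max (which chatters),
    the output varies linearly with s inside a boundary layer of width phi.
    """
    s = (x - x_follow) + lam * x_dot
    if s >= 0:
        # gap too large / opening: accelerate, linearly up to a_max
        return min(a_max, a_max * s / phi)
    # gap too small / closing: decelerate, linearly down to a_min
    return max(a_min, -a_min * s / phi)
```

Adapting lam, phi and the limits per detected situation is what the Situation Adaptation module would do, effectively adding the third dimension to the phase space that the text mentions.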
  • The use of this adapted SMC allows an intuitive specification and parameterization of the longitudinal behavior of the SV. The identical controller can be used for implementation in different vehicles by adapting the subsidiary engine controller to the vehicle characteristics.
  • Unlike a conventional cruise control, the traffic assist system of the present invention requires a number of factors to be present before it can safely activate. Optionally, the system may include an adaptive cruise control (ACC) type of interface. Optionally, for situations when the system does not activate when the driver selects the system, an indicator may be provided with symbols or indicia to indicate to the driver what is preventing the system from operating. For example, if the target vehicle is not detected, the system may highlight a figure of a leading car in yellow. To reduce confusion about the system state and warnings without diverting too much attention from the road, audible or voice alerts may be provided. For example, the activation of the system may be confirmed by the spoken message “Automated Drive Active”.
  • When the system of the present invention is in use, there is a possibility that the increasing degree of automation may cause the driver's alertness and attention to drift more quickly. To keep the driver alert and “in the loop”, precautions may be taken. Optionally, if the system does not detect any driver interaction for a certain period of time, the system may request a certain driver reaction, such as, for example, pulling an ACC lever or input, which may correspond to a re-activation of the traffic driving assist function. The driver may perform this action at any time within the period to reset the time counter. If the driver does not perform the required input within the time period following the request or alert, the demand may be escalated to a low-priority warning with a text, iconic or audible message for the driver. If still no driver reaction is detected, the system may request the driver to take over the guidance and otherwise will transition into a Safe Mode and slow the vehicle to a stop.
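  • The escalation could be modeled as a function of time since the last driver input; the 120 s comes from the operational parameters stated earlier, while the grace periods between stages are assumptions of this sketch:

```python
T_INTERACTION = 120.0  # s without driver input before a reaction is requested
T_GRACE = 10.0         # assumed grace period per escalation stage (not in the text)

def attention_state(t_since_input: float) -> str:
    """Escalation stages of the driver-attention monitor: normal operation,
    then a reaction request, then a low-priority warning, then Safe Mode
    with a takeover request and a gentle stop."""
    if t_since_input < T_INTERACTION:
        return "active"
    if t_since_input < T_INTERACTION + T_GRACE:
        return "request_reaction"
    if t_since_input < T_INTERACTION + 2 * T_GRACE:
        return "warn_driver"
    return "safe_mode_stop"
```

Any detected driver interaction would reset t_since_input to zero, returning the monitor to the "active" stage.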
  • Therefore, the present invention provides a traffic jam assist function using a mono-camera as the principal sensor. With a low cost sensor, the system can provide improved driving conditions for many commuters. Optionally, detection of and response to traffic signs, traffic lights and other road features may be integrated into the system to allow compliance with traffic rules while driving automated. Automated lane changes may be provided on the basis of low cost sensor fusion and environmental modeling. These features, complemented by increased confidence in the system, will allow the system to optionally operate at higher speeds and to operate without another vehicle to follow.
  • Optionally, the present invention may be operable to determine if a vehicle ahead of the subject vehicle changes course as it travels through a “blind” intersection, whereby the system may determine that such a change in course is indicative of a lane shift or an object ahead of the subject vehicle. There are many intersections that are crested. In some cases, it is because one road used to be the through-way while the other had to stop (and now there is a traffic light) or it might be due to coming up a hill and crossing a road that rides along the ridge (such as with some streets of San Francisco).
  • In such intersections, it can be difficult to determine where the vehicle is supposed to travel on the other side of the intersection. If the vehicle moves very slowly through the intersection, the vehicle will crest the intersection and the driver can see where he or she is supposed to steer the vehicle. If the driver lives in the area, he or she might know that the other side of the intersection splits into 2 lanes, or jogs a little to one side to make room for a left turn lane on the other side or the like. However, if the driver is unfamiliar and travelling at posted speeds through the intersection, it may be a bit of a surprise to find out the lane has shifted.
  • One of the biggest clues is the vehicle traveling ahead of the equipped or subject vehicle. If the leading vehicle crosses the intersection and moves to one side, the system (via processing of data captured by the forward facing camera or sensor) can use this information as a clue for the self-guided vehicle (in the absence of onboard maps that may show how the road changes at the intersection). The system, responsive to a determination of a shift by the leading vehicle, is operable to adjust the course for the subject vehicle as it crosses the intersection, and may then further adjust the course as the view unfolds as the vehicle continues across the intersection.
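  • The course adjustment from the leading vehicle's observed shift could be sketched as below; the bounding of the adjustment is an assumption of this sketch (the text only says the course is adjusted and then refined as the view unfolds), and all names are hypothetical:

```python
def adjust_course(planned_offset: float, lead_shift: float,
                  max_shift: float = 1.5) -> float:
    """Bias the SV's planned lateral offset across a blind (crested)
    intersection toward the leading vehicle's observed lateral shift,
    bounded so a single noisy observation cannot command a large swerve."""
    shift = max(-max_shift, min(max_shift, lead_shift))
    return planned_offset + shift
```

As the SV crests the intersection and the camera's view unfolds, the lane-marking-based path would take over again and further refine the course.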
  • The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
  • The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EYEQ2 or EYEQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
  • The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
  • For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or International Publication Nos. WO 2011/028686; WO 2010/099416; WO 2012/061567; WO 2012/068331; WO 2012/075250; WO 2012/103193; WO 2012/0116043; WO 2012/0145313; WO 2012/0145501; WO 2012/145818; WO 2012/145822; WO 2012/158167; WO 2012/075250; WO 2012/0116043; WO 2012/0145501; WO 2012/154919; WO 2013/019707; WO 2013/016409; WO 2013/019795; WO 2013/067083; WO 2013/070539; WO 2013/043661; WO 2013/048994; WO 2013/063014, WO 2013/081984; WO 2013/081985; WO 2013/074604; WO 2013/086249; WO 2013/103548; WO 2013/109869; WO 2013/123161; WO 2013/126715; WO 2013/043661 and/or WO 2013/158592 and/or PCT Application No. PCT/US2014/042229, filed Jun. 13, 2014, and published Dec. 24, 2014 as International Publication No. WO 2014/204794, and/or U.S. patent application Ser. No. 14/524,203, filed Oct. 27, 2014, now U.S. Pat. No. 9,457,717; Ser. No. 14/519,469, filed Oct. 21, 2014, now U.S. Pat. No. 9,881,220; Ser. No. 14/391,841, filed Oct. 10, 2014, now U.S. Pat. No. 9,751,465; Ser. No. 14/489,659, filed Sep. 18, 2014, and published Apr. 2, 2015 as U.S. Publication No. US-2015-0092042; Ser. No. 14/446,099, filed Aug. 22, 2014, now U.S. Pat. No. 9,343,245; Ser. No. 14/377,940, filed Aug. 11, 2014, and published Jan. 22, 2015 as U.S. Publication No. US-2015-0022665; Ser. No. 14/377,939, filed Aug. 11, 2014, now U.S. Pat. No. 9,871,971; Ser. No. 14/456,164, filed Aug. 11, 2014, now U.S. Pat. No. 9,619,716; Ser. No. 14/456,163, filed Aug. 
11, 2014, and published Feb. 12, 2015 as U.S. Publication No. US-2015-0042807; Ser. No. 14/456,162, filed Aug. 11, 2014, and published Feb. 12, 2015 as U.S. Publication No. US-2015-0042806; Ser. No. 14/373,501, filed Jul. 21, 2014, and published Jan. 29, 2015 as U.S. Publication No. US-2015-0028781; Ser. No. 14/372,524, filed Jul. 16, 2014, and published Jan. 22, 2015 as U.S. Publication No. US-2015-0022664; Ser. No. 14/324,696, filed Jul. 7, 2014, now U.S. Pat. No. 9,701,258; Ser. No. 14/316,940, filed Jun. 27, 2014, and published Jan. 8, 2015 as U.S. Publication No. US-2015-0009010; Ser. No. 14/316,939, filed Jun. 27, 2014, and published Jan. 1, 2015 as U.S. Publication No. US-2015-0002670; Ser. No. 14/303,696, filed Jun. 13, 2014, now U.S. Pat. No. 9,609,757; Ser. No. 14/303,695, filed Jun. 13, 2014, and published Dec. 15, 2014 as U.S. Publication No. US-2014-0375476; Ser. No. 14/303,694, filed Jun. 13, 2014, now U.S. Pat. No. 9,260,095; Ser. No. 14/303,693, filed Jun. 13, 2014, and published Dec. 18, 2014 as U.S. Publication No. US-2014-0368654; Ser. No. 14/297,663, filed Jun. 6, 2014, and published Dec. 11, 2014 as U.S. Publication No. US-2014-0362209; Ser. No. 14/362,636, filed Jun. 4, 2014, now U.S. Pat. No. 9,762,880; Ser. No. 14/290,028, filed May 29, 2014, now U.S. Pat. No. 9,800,794; Ser. No. 14/290,026, filed May 29, 2014, now U.S. Pat. No. 9,476,398; Ser. No. 14/282,029, filed May 20, 2014, now U.S. Pat. No. 9,205,776; Ser. No. 14/282,028, filed May 20, 2014, now U.S. Pat. No. 9,563,951; Ser. No. 14/358,232, filed May 15, 2014, now U.S. Pat. No. 9,491,451; Ser. No. 14/272,834, filed May 8, 2014, now U.S. Pat. No. 9,280,202; Ser. No. 14/356,330, filed May 5, 2014, now U.S. Pat. No. 9,604,581; Ser. No. 14/269,788, filed May 5, 2014, now U.S. Pat. No. 9,508,014; Ser. No. 14/268,169, filed May 2, 2014, and published Nov. 6, 2014 as U.S. Publication No. US-2014-0327772; Ser. No. 14/264,443, filed Apr. 29, 2014, and published Oct. 30, 2014 as U.S. 
Publication No. US-2014-0320636; Ser. No. 14/354,675, filed Apr. 28, 2014, now U.S. Pat. No. 9,580,013; Ser. No. 14/248,602, filed Apr. 9, 2014, now U.S. Pat. No. 9,327,693; Ser. No. 14/242,038, filed Apr. 1, 2014, now U.S. Pat. No. 9,487,159; Ser. No. 14/229,061, filed Mar. 28, 2014, and published Oct. 2, 2014 as U.S. Publication No. US-2014-0293042; Ser. No. 14/343,937, filed Mar. 10, 2014, now U.S. Pat. No. 9,681,062; Ser. No. 14/343,936, filed Mar. 10, 2014, and published Aug. 7, 2014 as U.S. Publication No. US-2014-0218535; Ser. No. 14/195,135, filed Mar. 3, 2014, now U.S. Pat. No. 9,688,200; Ser. No. 14/195,136, filed Mar. 3, 2014, and published Sep. 4, 2014 as U.S. Publication No. US-2014/0247355; Ser. No. 14/191,512, filed Feb. 27, 2014, and published Sep. 4, 2014 as U.S. Publication No. US-2014-0247352; Ser. No. 14/183,613, filed Feb. 19, 2014, now U.S. Pat. No. 9,445,057; Ser. No. 14/169,329, filed Jan. 31, 2014, and published Aug. 7, 2014 as U.S. Publication No. US-2014-0218529; Ser. No. 14/169,328, filed Jan. 31, 2014, now U.S. Pat. No. 9,092,986; Ser. No. 14/163,325, filed Jan. 24, 2014, and published Jul. 31, 2014 as U.S. Publication No. US-2014-0211009; Ser. No. 14/159,772, filed Jan. 21, 2014, now U.S. Pat. No. 9,068,390; Ser. No. 14/107,624, filed Dec. 16, 2013, now U.S. Pat. No. 9,140,789; Ser. No. 14/102,981, filed Dec. 11, 2013, now U.S. Pat. No. 9,558,409; Ser. No. 14/102,980, filed Dec. 11, 2013, and published Jun. 19, 2014 as U.S. Publication No. US-2014-0168437; Ser. No. 14/098,817, filed Dec. 6, 2013, and published Jun. 19, 2014 as U.S. Publication No. US-2014-0168415; Ser. No. 14/097,581, filed Dec. 5, 2013, now U.S. Pat. No. 9,481,301; Ser. No. 14/093,981, filed Dec. 2, 2013, now U.S. Pat. No. 8,917,169; Ser. No. 14/093,980, filed Dec. 2, 2013, and published Jun. 5, 2014 as U.S. Publication No. US-2014-0152825; Ser. No. 14/082,573, filed Nov. 18, 2013, now U.S. Pat. No. 9,743,002; Ser. No. 14/082,574, filed Nov. 18, 2013, now U.S. Pat. 
No. 9,307,640; Ser. No. 14/082,575, filed Nov. 18, 2013, now U.S. Pat. No. 9,090,234; Ser. No. 14/082,577, filed Nov. 18, 2013, now U.S. Pat. No. 8,818,042; Ser. No. 14/071,086, filed Nov. 4, 2013, now U.S. Pat. No. 8,886,401; Ser. No. 14/076,524, filed Nov. 11, 2013, now U.S. Pat. No. 9,077,962; Ser. No. 14/052,945, filed Oct. 14, 2013, now U.S. Pat. No. 9,707,896; Ser. No. 14/046,174, filed Oct. 4, 2013, now U.S. Pat. No. 9,723,272; Ser. No. 14/036,723, filed Sep. 25, 2013, now U.S. Pat. No. 9,446,713; Ser. No. 14/016,790, filed Sep. 3, 2013, now U.S. Pat. No. 9,761,142; Ser. No. 14/001,272, filed Aug. 23, 2013, now U.S. Pat. No. 9,233,641; Ser. No. 13/970,868, filed Aug. 20, 2013, now U.S. Pat. No. 9,365,162; Ser. No. 13/964,134, filed Aug. 12, 2013, now U.S. Pat. No. 9,340,227; Ser. No. 13/942,758, filed Jul. 16, 2013, and published Jan. 23, 2014 as U.S. Publication No. US-2014-0025240; Ser. No. 13/942,753, filed Jul. 16, 2013, and published Jan. 30, 2014 as U.S. Publication No. US-2014-0028852; Ser. No. 13/927,680, filed Jun. 26, 2013, and published Jan. 2, 2014 as U.S. Publication No. US-2014-0005907; Ser. No. 13/916,051, filed Jun. 12, 2013, now U.S. Pat. No. 9,077,098; Ser. No. 13/894,870, filed May 15, 2013, and published Nov. 28, 2013 as U.S. Publication No. US-2013-0314503; Ser. No. 13/887,724, filed May 6, 2013, now U.S. Pat. No. 9,670,895; Ser. No. 13/852,190, filed Mar. 28, 2013, and published Aug. 29, 2013 as U.S. Publication No. US-2013-0222593; Ser. No. 13/851,378, filed Mar. 27, 2013, now U.S. Pat. No. 9,319,637; Ser. No. 13/848,796, filed Mar. 22, 2012, and published Oct. 24, 2013 as U.S. Publication No. US-2013-0278769; Ser. No. 13/847,815, filed Mar. 20, 2013, and published Oct. 31, 2013 as U.S. Publication No. US-2013-0286193; Ser. No. 13/800,697, filed Mar. 13, 2013, and published Oct. 3, 2013 as U.S. Publication No. US-2013-0258077; Ser. No. 13/785,099, filed Mar. 5, 2013, now U.S. Pat. No. 9,565,342; Ser. No. 13/779,881, filed Feb. 
28, 2013, now U.S. Pat. No. 8,694,224; Ser. No. 13/774,317, filed Feb. 22, 2013, now U.S. Pat. No. 9,269,263; Ser. No. 13/774,315, filed Feb. 22, 2013, and published Aug. 22, 2013 as U.S. Publication No. US-2013-0215271; Ser. No. 13/681,963, filed Nov. 20, 2012, now U.S. Pat. No. 9,264,673; Ser. No. 13/660,306, filed Oct. 25, 2012, now U.S. Pat. No. 9,146,898; Ser. No. 13/653,577, filed Oct. 17, 2012, now U.S. Pat. No. 9,174,574; and/or Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. patent application Ser. No. 13/202,005, filed Aug. 17, 2011, now U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
  • The imaging device and control and image processor and any associated illumination source, if applicable, may comprise any suitable components, and may utilize aspects of the cameras and vision systems described in U.S. Pat. Nos. 5,550,677; 5,877,897; 6,498,620; 5,670,935; 5,796,094; 6,396,397; 6,806,452; 6,690,268; 7,005,974; 7,937,667; 7,123,168; 7,004,606; 6,946,978; 7,038,577; 6,353,392; 6,320,176; 6,313,454 and/or 6,824,281, and/or International Publication Nos. WO 2010/099416; WO 2011/028686 and/or WO 2013/016409, and/or U.S. Pat. Publication No. US 2010-0020170, and/or U.S. patent application Ser. No. 13/534,657, filed Jun. 27, 2012, and published Jan. 3, 2013 as U.S. Publication No. US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The camera or cameras may comprise any suitable cameras or imaging sensors or camera modules, and may utilize aspects of the cameras or sensors described in U.S. Publication No. US-2009-0244361 and/or U.S. Pat. Nos. 8,542,451; 7,965,336 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties. The imaging array sensor may comprise any suitable sensor, and may utilize various imaging sensors or imaging array sensors or cameras or the like, such as a CMOS imaging array sensor, a CCD sensor or other sensors or the like, such as the types described in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,715,093; 5,877,897; 6,922,292; 6,757,109; 6,717,610; 6,590,719; 6,201,642; 6,498,620; 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 6,806,452; 6,396,397; 6,822,563; 6,946,978; 7,339,149; 7,038,577; 7,004,606; 7,720,580 and/or 7,965,336, and/or International Publication Nos. WO 2009/036176 and/or WO 2009/046268, which are all hereby incorporated herein by reference in their entireties.
  • The camera module and circuit chip or board and imaging sensor may be implemented and operated in connection with various vehicular vision-based systems, and/or may be operable utilizing the principles of such other vehicular systems, such as a vehicle headlamp control system, such as the type disclosed in U.S. Pat. Nos. 5,796,094; 6,097,023; 6,320,176; 6,559,435; 6,831,261; 7,004,606; 7,339,149 and/or 7,526,103, which are all hereby incorporated herein by reference in their entireties, a rain sensor, such as the types disclosed in commonly assigned U.S. Pat. Nos. 6,353,392; 6,313,454; 6,320,176 and/or 7,480,149, which are hereby incorporated herein by reference in their entireties, a vehicle vision system, such as a forwardly, sidewardly or rearwardly directed vehicle vision system utilizing principles disclosed in U.S. Pat. Nos. 5,550,677; 5,670,935; 5,760,962; 5,877,897; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978 and/or 7,859,565, which are all hereby incorporated herein by reference in their entireties, a trailer hitching aid or tow check system, such as the type disclosed in U.S. Pat. No. 7,005,974, which is hereby incorporated herein by reference in its entirety, a reverse or sideward imaging system, such as for a lane change assistance system or lane departure warning system or for a blind spot or object detection system, such as imaging or detection systems of the types disclosed in U.S. Pat. Nos. 7,881,496; 7,720,580; 7,038,577; 5,929,786 and/or 5,786,772, which are hereby incorporated herein by reference in their entireties, a video device for internal cabin surveillance and/or video telephone function, such as disclosed in U.S. Pat. Nos. 5,760,962; 5,877,897; 6,690,268 and/or 7,370,983, and/or U.S. patent application Ser. No. 10/538,724, filed Jun. 13, 2005 and published Mar. 9, 2006 as U.S. Publication No. 
US-2006-0050018, which are hereby incorporated herein by reference in their entireties, a traffic sign recognition system, a system for determining a distance to a leading or trailing vehicle or object, such as a system utilizing the principles disclosed in U.S. Pat. Nos. 6,396,397 and/or 7,123,168, which are hereby incorporated herein by reference in their entireties, and/or the like.
  • Optionally, the circuit board or chip may include circuitry for the imaging array sensor and/or other electronic accessories or features, such as by utilizing compass-on-a-chip or EC driver-on-a-chip technology and aspects such as described in U.S. Pat. Nos. 7,255,451 and/or 7,480,149 and/or U.S. Publication No. US-2006-0061008 and/or U.S. patent application Ser. No. 12/578,732, filed Oct. 14, 2009, now U.S. Pat. No. 9,487,144, which are hereby incorporated herein by reference in their entireties.
  • Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device disposed at or in the interior rearview mirror assembly of the vehicle, such as by utilizing aspects of the video mirror display systems described in U.S. Pat. No. 6,690,268 and/or U.S. patent application Ser. No. 13/333,337, filed Dec. 21, 2011, now U.S. Pat. No. 9,264,672, which are hereby incorporated herein by reference in their entireties. The video mirror display may comprise any suitable devices and systems and optionally may utilize aspects of the compass display systems described in U.S. Pat. Nos. 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or European patent application, published Oct. 11, 2000 under Publication No. EP 0 1043566, and/or U.S. Publication No. US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the video mirror display screen or device may be operable to display images captured by a rearward viewing camera of the vehicle during a reversing maneuver of the vehicle (such as responsive to the vehicle gear actuator being placed in a reverse gear position or the like) to assist the driver in backing up the vehicle, and optionally may be operable to display the compass heading or directional heading character or icon when the vehicle is not undertaking a reversing maneuver, such as when the vehicle is being driven in a forward direction along a road (such as by utilizing aspects of the display system described in International Publication No. 
WO 2012/051500, which is hereby incorporated herein by reference in its entirety).
  • Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, which are hereby incorporated herein by reference in their entireties.
  • Optionally, a video mirror display may be disposed rearward of and behind the reflective element assembly and may comprise a display such as the types disclosed in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,370,983; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187 and/or 6,690,268, and/or in U.S. Publication Nos. US-2006-0061008 and/or US-2006-0050018, which are all hereby incorporated herein by reference in their entireties. The display is viewable through the reflective element when the display is activated to display information. The display element may be any type of display element, such as a vacuum fluorescent (VF) display element, a light emitting diode (LED) display element, such as an organic light emitting diode (OLED) or an inorganic light emitting diode, an electroluminescent (EL) display element, a liquid crystal display (LCD) element, a video screen display element or backlit thin film transistor (TFT) display element or the like, and may be operable to display various information (as discrete characters, icons or the like, or in a multi-pixel manner) to the driver of the vehicle, such as passenger side inflatable restraint (PSIR) information, tire pressure status, and/or the like. The mirror assembly and/or display may utilize aspects described in U.S. Pat. Nos. 7,184,190; 7,255,451; 7,446,924 and/or 7,338,177, which are all hereby incorporated herein by reference in their entireties. The thicknesses and materials of the coatings on the substrates of the reflective element may be selected to provide a desired color or tint to the mirror reflective element, such as a blue colored reflector, such as is known in the art and such as described in U.S. Pat. Nos. 5,910,854; 6,420,036 and/or 7,274,501, which are hereby incorporated herein by reference in their entireties.
  • Optionally, the display or displays and any associated user inputs may be associated with various accessories or systems, such as, for example, a tire pressure monitoring system or a passenger air bag status or a garage door opening system or a telematics system or any other accessory or system of the mirror assembly or of the vehicle or of an accessory module or console of the vehicle, such as an accessory module or console of the types described in U.S. Pat. Nos. 7,289,037; 6,877,888; 6,824,281; 6,690,268; 6,672,744; 6,386,742 and/or 6,124,886, and/or U.S. Publication No. US-2006-0050018, which are hereby incorporated herein by reference in their entireties.
  • Changes and modifications to the specifically described embodiments may be carried out without departing from the principles of the present invention, which is intended to be limited only by the scope of the appended claims as interpreted according to the principles of patent law.

Claims (20)

1. A vehicular control system, the vehicular control system comprising:
a camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the camera viewing through the windshield and forward of the equipped vehicle;
a control comprising an image processor operable to process image data captured by the camera;
wherein, with the equipped vehicle moving in a forward direction along a road, the control, via image processing at the image processor of image data captured by the camera, determines presence of a pedestrian ahead of the vehicle and in the field of view of the camera;
wherein the control, via image processing at the image processor of image data captured by the camera, determines if the pedestrian present ahead of the equipped vehicle is moving across a path of travel of the equipped vehicle; and
wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to allow the pedestrian to move out of the path of travel of the forward moving equipped vehicle.
2. The vehicular control system of claim 1, wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to less than or equal to 2 m/s.
3. The vehicular control system of claim 1, wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, determines a predicted path of the pedestrian and maneuvers the equipped vehicle to avoid collision with the pedestrian.
4. The vehicular control system of claim 3, wherein the control autonomously maneuvers the equipped vehicle to avoid collision with the pedestrian.
5. The vehicular control system of claim 4, wherein the control autonomously maneuvers the equipped vehicle by controlling a steering system of the equipped vehicle, a braking system of the equipped vehicle and an accelerator of the equipped vehicle.
6. The vehicular control system of claim 4, wherein the control provides autonomous maneuvering of the equipped vehicle when the forward speed of the equipped vehicle is less than 25 kph.
7. The vehicular control system of claim 1, wherein the control, via image processing at the image processor of image data captured by the camera, determines if the pedestrian present ahead of the equipped vehicle and in the field of view of the camera is outside of the path of travel of the equipped vehicle and moving toward the path of travel of the equipped vehicle, and wherein the control, responsive to determination that the pedestrian present ahead of the equipped vehicle and in the field of view of the camera is outside of the path of travel of the equipped vehicle and moving toward the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to allow the pedestrian to move into and across the path of travel of the forward moving equipped vehicle.
8. The vehicular control system of claim 1, wherein the control system determines a traffic condition that the equipped vehicle is traveling in, and wherein, responsive to determination of the traffic condition, the control system is operable to control a steering system of the equipped vehicle.
9. The vehicular control system of claim 1, wherein the control, via image processing at the image processor of image data captured by the camera, determines lane markings on the road being traveled by the equipped vehicle.
10. The vehicular control system of claim 9, wherein the control determines a leading vehicle ahead of the equipped vehicle, and wherein the control controls a steering system of the equipped vehicle to follow the determined leading vehicle irrespective of the determined lane markings on the road being traveled.
11. The vehicular control system of claim 10, wherein the control system, at least in part responsive to detection of another vehicle indicating an intent to change lanes into the occupied lane ahead of the equipped vehicle, controls a braking system of the equipped vehicle to decelerate the equipped vehicle to allow for the lane change by the other vehicle.
12. A vehicular control system, the vehicular control system comprising:
a camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the camera viewing through the windshield and forward of the equipped vehicle;
a control comprising an image processor operable to process image data captured by the camera;
wherein, with the equipped vehicle moving in a forward direction along a road, the control, via image processing at the image processor of image data captured by the camera, determines presence of a pedestrian ahead of the vehicle and in the field of view of the camera;
wherein the control, via image processing at the image processor of image data captured by the camera, determines if the pedestrian present ahead of the equipped vehicle is stationary in the path of travel of the equipped vehicle;
wherein the control, at least in part responsive to determination that the pedestrian is stationary in the path of travel of the equipped vehicle, stops the equipped vehicle; and
wherein the control, responsive to determining that the pedestrian has moved out of the path of travel of the equipped vehicle, resumes forward travel.
13. The vehicular control system of claim 12, wherein the control system determines a traffic condition that the equipped vehicle is traveling in, and wherein, responsive to determination of the traffic condition, the control system is operable to control a steering system of the equipped vehicle.
14. The vehicular control system of claim 12, wherein the control, via image processing at the image processor of image data captured by the camera, determines lane markings on the road being traveled by the equipped vehicle.
15. The vehicular control system of claim 14, wherein the control determines a leading vehicle ahead of the equipped vehicle, and wherein the control controls a steering system of the equipped vehicle to follow the determined leading vehicle irrespective of the determined lane markings on the road being traveled.
16. The vehicular control system of claim 15, wherein the control system, at least in part responsive to detection of another vehicle indicating an intent to change lanes into the occupied lane ahead of the equipped vehicle, controls a braking system of the equipped vehicle to decelerate the equipped vehicle to allow for the lane change by the other vehicle.
17. A vehicular control system, the vehicular control system comprising:
a camera disposed at an in-cabin side of a windshield of a vehicle equipped with the vehicular control system, the camera viewing through the windshield and forward of the equipped vehicle;
a control comprising an image processor operable to process image data captured by the camera;
wherein, with the equipped vehicle moving in a forward direction along a road, the control, via image processing at the image processor of image data captured by the camera, determines presence of a pedestrian ahead of the vehicle and in the field of view of the camera;
wherein the control, via image processing at the image processor of image data captured by the camera, determines if the pedestrian present ahead of the equipped vehicle (i) is stationary in the path of travel of the equipped vehicle, (ii) is moving across the path of travel of the equipped vehicle or (iii) is outside of the path of travel of the equipped vehicle and moving toward the path of travel of the equipped vehicle;
wherein the control, at least in part responsive to determination that the pedestrian is stationary in the path of travel of the equipped vehicle, stops the equipped vehicle;
wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to allow the pedestrian to move out of the path of travel of the forward moving equipped vehicle; and
wherein the control, responsive to determination that the pedestrian present ahead of the equipped vehicle is outside of the path of travel of the equipped vehicle and moving toward the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to allow the pedestrian to move into and across the path of travel of the forward moving equipped vehicle.
18. The vehicular control system of claim 17, wherein the control, after the control has stopped the equipped vehicle responsive to determination that the pedestrian is stationary in the path of travel of the equipped vehicle, and responsive to determining that the pedestrian has moved out of the path of travel of the equipped vehicle, resumes forward travel.
19. The vehicular control system of claim 17, wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, reduces forward speed of the equipped vehicle to less than or equal to 2 m/s.
20. The vehicular control system of claim 17, wherein the control, at least in part responsive to determination that the pedestrian is moving across the path of travel of the equipped vehicle, determines a predicted path of the pedestrian and maneuvers the equipped vehicle to avoid collision with the pedestrian.
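The three pedestrian cases enumerated in claims 1, 12 and 17, and the speed responses tied to them, can be sketched as a minimal speed planner. The names and interface below are illustrative only and are not part of the claims; the 2 m/s creep speed comes from claims 2 and 19, and the stop/resume behavior from claims 12 and 18:

```python
from enum import Enum, auto

class PedestrianState(Enum):
    """Pedestrian classifications named in claims 12 and 17."""
    STATIONARY_IN_PATH = auto()   # (i) stationary in the path of travel
    CROSSING_PATH = auto()        # (ii) moving across the path of travel
    APPROACHING_PATH = auto()     # (iii) outside the path, moving toward it
    CLEAR = auto()                # no pedestrian in or near the path

CREEP_SPEED_MPS = 2.0  # claims 2 and 19: reduce forward speed to <= 2 m/s

def plan_speed(state: PedestrianState, current_speed_mps: float) -> float:
    """Target forward speed for the equipped vehicle given the pedestrian state."""
    if state is PedestrianState.STATIONARY_IN_PATH:
        # claims 12 and 17: stop until the pedestrian moves out of the path
        return 0.0
    if state in (PedestrianState.CROSSING_PATH, PedestrianState.APPROACHING_PATH):
        # claims 1, 7 and 17: slow so the pedestrian can move across and clear the path
        return min(current_speed_mps, CREEP_SPEED_MPS)
    # claims 12 and 18: resume forward travel once the path is clear
    return current_speed_mps
```

In practice this planner would sit downstream of the image processor's pedestrian classification and feed the braking and accelerator controls; the claims also allow an avoidance steering maneuver (claims 3 to 6), which is omitted here.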
US16/946,434 2013-12-12 2020-06-22 Vehicular control system with pedestrian avoidance Pending US20200324764A1 (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US16/946,434 (published as US20200324764A1) | 2013-12-12 | 2020-06-22 | Vehicular control system with pedestrian avoidance

Applications Claiming Priority (6)

Application Number | Priority Date | Filing Date | Title
US201361915218P | 2013-12-12 | 2013-12-12
US201361919133P | 2013-12-20 | 2013-12-20
US201461953970P | 2014-03-17 | 2014-03-17
US14/568,177 (now US9988047B2) | 2013-12-12 | 2014-12-12 | Vehicle control system with traffic driving control
US15/996,727 (now US10688993B2) | 2013-12-12 | 2018-06-04 | Vehicle control system with traffic driving control
US16/946,434 (published as US20200324764A1) | 2013-12-12 | 2020-06-22 | Vehicular control system with pedestrian avoidance

Related Parent Applications (1)

Application Number | Title | Priority Date | Filing Date
US15/996,727 (Continuation; now US10688993B2) | Vehicle control system with traffic driving control | 2013-12-12 | 2018-06-04

Publications (1)

Publication Number | Publication Date
US20200324764A1 | 2020-10-15

Family

ID=53367459

Family Applications (3)

Application Number | Status | Priority Date | Filing Date | Title
US14/568,177 (now US9988047B2) | Active 2035-05-12 | 2013-12-12 | 2014-12-12 | Vehicle control system with traffic driving control
US15/996,727 (now US10688993B2) | Active 2035-06-01 | 2013-12-12 | 2018-06-04 | Vehicle control system with traffic driving control
US16/946,434 (published as US20200324764A1) | Pending | 2013-12-12 | 2020-06-22 | Vehicular control system with pedestrian avoidance


Country Status (1)

Country Link
US (3) US9988047B2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication Number | Priority Date | Publication Date | Assignee | Title
US10974649B2 * | 2017-10-02 | 2021-04-13 | Magna Electronics Inc. | Vehicular parking assist system using backup camera
CN113110069A * | 2021-05-24 | 2021-07-13 | Wuhan University | Iterative neural network robust control method based on magnetic suspension planar motor
US20210341927A1 * | 2018-11-09 | 2021-11-04 | Waymo Llc | Verifying Predicted Trajectories Using A Grid-Based Approach
WO2023172406A1 * | 2022-03-08 | 2023-09-14 | Georgia Tech Research Corporation | Systems and methods for vehicle signaling and bargaining

Families Citing this family (217)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8606512B1 (en) 2007-05-10 2013-12-10 Allstate Insurance Company Route risk mitigation
US10157422B2 (en) 2007-05-10 2018-12-18 Allstate Insurance Company Road segment safety rating
US9932033B2 (en) 2007-05-10 2018-04-03 Allstate Insurance Company Route risk mitigation
US10096038B2 (en) 2007-05-10 2018-10-09 Allstate Insurance Company Road segment safety rating system
DE102012212065A1 (en) * 2012-07-11 2014-01-16 Robert Bosch Gmbh Method for operating a driver assistance system for a vehicle and driver assistance system for a vehicle
JP6145508B2 (en) * 2012-08-16 2017-06-14 ジャガー ランド ローバー リミテッドJaguar Land Rover Limited System and method for controlling vehicle speed to improve occupant comfort
WO2014137251A1 (en) * 2013-03-06 2014-09-12 Volvo Truck Corporation Method for calculating a desired yaw rate for a vehicle
DE102013009424A1 (en) * 2013-06-04 2014-12-04 Volkswagen Aktiengesellschaft Emergency assistance without activated lateral guidance assistance
DE102013212318A1 (en) * 2013-06-26 2014-12-31 Bayerische Motoren Werke Aktiengesellschaft Automated parking procedure with additional correction train
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US9355423B1 (en) 2014-01-24 2016-05-31 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US9390451B1 (en) 2014-01-24 2016-07-12 Allstate Insurance Company Insurance system related to a vehicle-to-vehicle communication system
US10096067B1 (en) 2014-01-24 2018-10-09 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
US10803525B1 (en) 2014-02-19 2020-10-13 Allstate Insurance Company Determining a property of an insurance policy based on the autonomous features of a vehicle
US10783587B1 (en) * 2014-02-19 2020-09-22 Allstate Insurance Company Determining a driver score based on the driver's response to autonomous features of a vehicle
US9940676B1 (en) 2014-02-19 2018-04-10 Allstate Insurance Company Insurance system for analysis of autonomous driving
US10796369B1 (en) 2014-02-19 2020-10-06 Allstate Insurance Company Determining a property of an insurance policy based on the level of autonomy of a vehicle
US10783586B1 (en) 2014-02-19 2020-09-22 Allstate Insurance Company Determining a property of an insurance policy based on the density of vehicles
TW201538369A (en) * 2014-04-09 2015-10-16 Papago Inc Traffic lane deviation warning system and method by combination with driving behavior
DE102014208524A1 (en) * 2014-05-07 2015-11-12 Robert Bosch Gmbh LOCAL TRANSPORTATION ANALYSIS WITH DETECTION OF A TRAFFIC PATH
JP6103716B2 (en) * 2014-06-17 2017-03-29 富士重工業株式会社 Vehicle travel control device
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
KR101664582B1 (en) * 2014-11-12 2016-10-10 현대자동차주식회사 Path Planning Apparatus and Method for Autonomous Vehicle
US9278689B1 (en) * 2014-11-13 2016-03-08 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle detection of and response to emergency vehicles
JP2016103194A (en) * 2014-11-28 2016-06-02 パナソニックIpマネジメント株式会社 Vehicle travel support system and vehicle travel support method
EP3043289B1 (en) * 2015-01-07 2023-04-19 Honda Research Institute Europe GmbH Control system for an autonomous vehicle and a method for generating a control signal and autonomous vehicle equipped with such control system
JP6528690B2 (en) * 2015-02-10 2019-06-12 株式会社デンソー Save control device, save control method
EP3885217A1 (en) * 2015-02-10 2021-09-29 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation
DE102015203619A1 (en) * 2015-02-28 2016-09-01 Bayerische Motoren Werke Aktiengesellschaft Parking assistance system with detection of a universal parking space
EP3279052A4 (en) * 2015-03-31 2018-12-05 Hitachi Automotive Systems, Ltd. Automatic driving control device
DE102015004478A1 (en) * 2015-04-07 2016-10-13 Lucas Automotive Gmbh A control system and method for enabling a shunting of another motor vehicle from a neighboring lane in the ACC operation of the own motor vehicle
DE102015207025A1 (en) * 2015-04-17 2016-10-20 Bayerische Motoren Werke Aktiengesellschaft Driver assistance system in a motor vehicle
JP6384400B2 (en) * 2015-05-18 2018-09-05 株式会社デンソー Collision avoidance device
KR102366402B1 (en) * 2015-05-21 2022-02-22 엘지전자 주식회사 Driver assistance apparatus and control method for the same
JP6396850B2 (en) * 2015-05-29 2018-09-26 株式会社デンソー Driving support device and driving support method
US9478137B1 (en) * 2015-06-17 2016-10-25 Ford Global Technologies, Llc Detecting and communicating lane splitting maneuver
US9696723B2 (en) * 2015-06-23 2017-07-04 GM Global Technology Operations LLC Smart trailer hitch control using HMI assisted visual servoing
US10514698B2 (en) * 2015-07-27 2019-12-24 Nissan Motor Co., Ltd. Route guidance device and route guidance method
JP6532786B2 (en) * 2015-08-07 2019-06-19 株式会社日立製作所 Vehicle travel control device and speed control method
US11691619B2 (en) 2015-08-12 2023-07-04 Hyundai Motor Company Automatic parking system and automatic parking method
US10392009B2 (en) 2015-08-12 2019-08-27 Hyundai Motor Company Automatic parking system and automatic parking method
KR101704244B1 (en) * 2015-08-12 2017-02-22 현대자동차주식회사 Remote Parking Method and Apparatus
US9764736B2 (en) * 2015-08-14 2017-09-19 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle operation relative to unexpected dynamic objects
US10234859B2 (en) * 2015-08-20 2019-03-19 Harman International Industries, Incorporated Systems and methods for driver assistance
DE102015216152A1 (en) * 2015-08-25 2017-03-02 Conti Temic Microelectronic Gmbh Lane keeping assistance system, motor vehicle with such a lane keeping assistance device and a method for lane keeping monitoring
US20170057517A1 (en) * 2015-09-01 2017-03-02 Continental Automotive Systems, Inc. Behavior trainable adaptive cruise control
DE102015216881A1 (en) * 2015-09-03 2017-03-09 Robert Bosch Gmbh Method and device for driverless driving a motor vehicle within a parking lot
US11162793B2 (en) * 2015-09-15 2021-11-02 Here Global B.V. Method and apparatus for autonomous navigation speed at intersections
EP3144919B1 (en) * 2015-09-18 2020-06-24 Continental Automotive GmbH Device and method for start assistance for a motor vehicle
US9721472B2 (en) * 2015-09-22 2017-08-01 Ford Global Technologies, Llc Formulating lane level routing plans
US10315651B2 (en) 2015-10-13 2019-06-11 Magna Electronics Inc. Vehicle vision system with lateral control algorithm for lane keeping
DE112015006932T5 (en) * 2015-10-20 2018-06-21 Ford Global Technologies, Llc Supporting meandering of motorcycles
JP6728634B2 (en) * 2015-11-04 2020-07-22 株式会社リコー Detecting device, detecting method and program
US9983591B2 (en) * 2015-11-05 2018-05-29 Ford Global Technologies, Llc Autonomous driving at intersections based on perception data
JP6485328B2 (en) * 2015-11-09 2019-03-20 株式会社デンソー Vehicle driving support device
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
US10407047B2 (en) * 2015-12-07 2019-09-10 Magna Electronics Inc. Vehicle control system with target vehicle trajectory tracking
US9956956B2 (en) * 2016-01-11 2018-05-01 Denso Corporation Adaptive driving system
US10030978B2 (en) * 2016-01-17 2018-07-24 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for detection of surrounding vehicle lane departure
US10474964B2 (en) * 2016-01-26 2019-11-12 Ford Global Technologies, Llc Training algorithm for collision avoidance
JP6330825B2 (en) * 2016-01-26 2018-05-30 トヨタ自動車株式会社 Vehicle collision avoidance support system
US10269075B2 (en) 2016-02-02 2019-04-23 Allstate Insurance Company Subjective route risk mapping and mitigation
US9802599B2 (en) * 2016-03-08 2017-10-31 Ford Global Technologies, Llc Vehicle lane placement
JP6298092B2 (en) * 2016-03-18 2018-03-20 株式会社Subaru Vehicle stop device
US20190051156A1 (en) * 2016-03-21 2019-02-14 Ford Global Technologies, Llc Propagation of alerts regarding traffic events
CN109070888A (en) * 2016-03-21 2018-12-21 福特全球技术公司 Propagate the alarm about traffic events
EP3286056B1 (en) * 2016-03-23 2021-01-06 Deutsche Telekom AG System and method for a full lane change aid system with augmented reality technology
US10077052B2 (en) * 2016-03-31 2018-09-18 Faraday&Future Inc. State-based operation for autonomous vehicles
JP6668895B2 (en) * 2016-04-01 2020-03-18 株式会社デンソー Driving support device
US9896096B2 (en) * 2016-04-11 2018-02-20 David E. Newman Systems and methods for hazard mitigation
JP6418199B2 (en) 2016-04-27 2018-11-07 トヨタ自動車株式会社 Automatic vehicle driving system
JP6418407B2 (en) * 2016-05-06 2018-11-07 トヨタ自動車株式会社 Brake control device for vehicle
US9829889B1 (en) 2016-05-10 2017-11-28 Toyota Motor Engineering & Manufacturing North America, Inc. Autonomous vehicle advanced notification system and method of use
US9922563B2 (en) * 2016-05-24 2018-03-20 International Business Machines Corporation Dynamic cross-lane travel path determination by self-driving vehicles
EP3254919B1 (en) * 2016-06-07 2021-10-06 Volvo Car Corporation Adaptive cruise control system and vehicle comprising an adaptive cruise control system
WO2017216856A1 (en) * 2016-06-14 2017-12-21 日産自動車株式会社 Inter-vehicle distance estimation method and inter-vehicle distance estimation device
US9840253B1 (en) * 2016-06-14 2017-12-12 Delphi Technologies, Inc. Lane keeping system for autonomous vehicle during camera drop-outs
CA3028933C (en) * 2016-06-27 2019-09-10 Nissan Motor Co., Ltd. Vehicle control method and vehicle control device
US10093311B2 (en) 2016-07-06 2018-10-09 Waymo Llc Testing predictions for autonomous vehicles
US10828954B2 (en) * 2016-07-13 2020-11-10 Ford Global Technologies, Llc Ride performance optimization systems and devices, and related methods
JP6520862B2 (en) * 2016-08-10 2019-05-29 トヨタ自動車株式会社 Automatic driving system
JP6572847B2 (en) 2016-08-10 2019-09-11 トヨタ自動車株式会社 Automated driving system
US10543852B2 (en) * 2016-08-20 2020-01-28 Toyota Motor Engineering & Manufacturing North America, Inc. Environmental driver comfort feedback for autonomous vehicle
JP6729220B2 (en) * 2016-09-08 2020-07-22 トヨタ自動車株式会社 Vehicle driving support device
DE102016217770A1 (en) * 2016-09-16 2018-03-22 Audi Ag Method for operating a motor vehicle
US10210760B2 (en) * 2016-09-21 2019-02-19 Dura Operating, Llc System and method for autonomous parking of a vehicle
DE102016219594A1 (en) * 2016-10-10 2018-04-12 Volkswagen Aktiengesellschaft Method and device for driving dynamics control for a motor vehicle
US10202118B2 (en) 2016-10-14 2019-02-12 Waymo Llc Planning stopping locations for autonomous vehicles
US10594934B2 (en) * 2016-11-17 2020-03-17 Bendix Commercial Vehicle Systems Llc Vehicle display
CN109923597A (en) * 2016-11-18 2019-06-21 三菱电机株式会社 Drive assistance device and driving assistance method
JP6616275B2 (en) * 2016-12-15 2019-12-04 株式会社Soken Driving assistance device
JP6809890B2 (en) * 2016-12-15 2021-01-06 日立オートモティブシステムズ株式会社 Vehicle control device
US10150474B2 (en) * 2017-01-04 2018-12-11 Robert Bosch Gmbh Reducing lateral position deviation during an automated lane change
WO2018132608A2 (en) * 2017-01-12 2018-07-19 Mobileye Vision Technologies Ltd. Navigation based on occlusion zones
US11318952B2 (en) * 2017-01-24 2022-05-03 Ford Global Technologies, Llc Feedback for an autonomous vehicle
DE102017202415A1 (en) * 2017-02-15 2018-08-16 Bayerische Motoren Werke Aktiengesellschaft Collision avoidance with cross traffic
JP6497818B2 (en) 2017-03-10 2019-04-10 株式会社Subaru Image display device
JP6465317B2 (en) 2017-03-10 2019-02-06 株式会社Subaru Image display device
JP6593803B2 (en) 2017-03-10 2019-10-23 株式会社Subaru Image display device
JP6497819B2 (en) 2017-03-10 2019-04-10 株式会社Subaru Image display device
JP6465318B2 (en) 2017-03-10 2019-02-06 株式会社Subaru Image display device
JP6515125B2 (en) 2017-03-10 2019-05-15 株式会社Subaru Image display device
JP6429413B2 (en) * 2017-03-10 2018-11-28 株式会社Subaru Image display device
US10430968B2 (en) * 2017-03-14 2019-10-01 Ford Global Technologies, Llc Vehicle localization using cameras
EP3611077B1 (en) * 2017-04-14 2022-02-09 Nissan Motor Co., Ltd. Vehicle control method and vehicle control device
US10509409B2 (en) * 2017-04-27 2019-12-17 Aptiv Technologies Limited Local traffic customs learning system for automated vehicle
US10373501B2 (en) * 2017-05-10 2019-08-06 Aptiv Technologies Limited Automated vehicle control strategy for pedestrian crowds
US10501074B2 (en) * 2017-06-09 2019-12-10 Robert Bosch Gmbh Methods and systems for reducing vehicle and animal collisions
EP3413082B1 (en) * 2017-06-09 2020-01-01 Veoneer Sweden AB A vehicle system for detection of oncoming vehicles
DE102017209846B4 (en) * 2017-06-12 2019-02-07 Ford Global Technologies, Llc A vehicle having an adaptive override emergency brake system and method for adjusting an emergency brake override threshold
EP3437022A4 (en) * 2017-06-22 2019-02-06 Baidu.com Times Technology (Beijing) Co., Ltd. Traffic prediction based on map images for autonomous driving
CN107330921A (en) * 2017-06-28 2017-11-07 京东方科技集团股份有限公司 Queuing device and queuing control method therefor
US20190027034A1 (en) * 2017-07-19 2019-01-24 Aptiv Technologies Limited Variable steering error limits for automated vehicle control
KR20190012370A (en) * 2017-07-27 2019-02-11 삼성에스디에스 주식회사 Method and apparatus for lane change support
US10406438B2 (en) * 2017-07-28 2019-09-10 Microsoft Technology Licensing, Llc Controlling behavior of entities in funnel sections of a computer-represented environment
US10275043B2 (en) 2017-08-23 2019-04-30 Ford Global Technologies, Llc Detection of lane conditions in adaptive cruise control systems
JP6664360B2 (en) * 2017-09-08 2020-03-13 本田技研工業株式会社 Judgment device and vehicle
JP6765357B2 (en) * 2017-09-14 2020-10-07 本田技研工業株式会社 Driving control device, driving control method and program
JP7184151B2 (en) * 2017-09-19 2022-12-06 スズキ株式会社 Vehicle travel control device
US11193780B2 (en) * 2017-09-19 2021-12-07 Continental Automotive Systems, Inc. Vehicle safety system and method for providing a recommended path
WO2019064490A1 (en) * 2017-09-29 2019-04-04 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
US10217354B1 (en) * 2017-10-02 2019-02-26 Bertram V Burke Move over slow drivers cell phone technology
JP6933080B2 (en) * 2017-10-05 2021-09-08 いすゞ自動車株式会社 Vehicle speed control device
KR102070605B1 (en) * 2017-10-27 2020-03-02 주식회사 만도 Autonomous emergency braking system and method by predicting circumstances surrounding vehicle
WO2019089591A1 (en) * 2017-10-30 2019-05-09 Mobileye Vision Technologies Ltd. Vehicle navigation based on human activity
GB2568060B (en) * 2017-11-02 2020-02-12 Jaguar Land Rover Ltd Controller for a vehicle
JP6926957B2 (en) * 2017-11-02 2021-08-25 トヨタ自動車株式会社 Lane change support device
US10836386B2 (en) * 2017-11-10 2020-11-17 GM Global Technology Operations LLC Determination of roll angle and bank angle with suspension displacement data
DE102017221619A1 (en) * 2017-11-30 2019-06-06 Volkswagen Aktiengesellschaft Method and device for indicating a feasibility of an at least partially automatically feasible driving maneuver in a vehicle
US10293819B1 (en) * 2017-12-19 2019-05-21 Trw Automotive U.S. Llc Autonomous roadway merge assist system
RU2683618C1 (en) * 2017-12-21 2019-03-29 Общество с ограниченной ответственностью "Фирма "ТЕСА" System for determination of the actual parameters of a carriageway
EP3744598A4 (en) * 2018-01-23 2021-03-03 Nissan Motor Co., Ltd. Vehicle control method and vehicle control system
US10304341B1 (en) * 2018-02-09 2019-05-28 International Business Machines Corporation Vehicle and bicycle communication to avoid vehicle door crash accidents
CN111788104B (en) * 2018-03-07 2023-09-26 罗伯特·博世有限公司 Known lane spacing for automated driving
JP7108916B2 (en) * 2018-03-13 2022-07-29 パナソニックIpマネジメント株式会社 vehicle controller
EP3540710A1 (en) * 2018-03-14 2019-09-18 Honda Research Institute Europe GmbH Method for assisting operation of an ego-vehicle, method for assisting other traffic participants and corresponding assistance systems and vehicles
US11077854B2 (en) 2018-04-11 2021-08-03 Hyundai Motor Company Apparatus for controlling lane change of vehicle, system having the same and method thereof
EP3552902A1 (en) 2018-04-11 2019-10-16 Hyundai Motor Company Apparatus and method for providing a driving path to a vehicle
EP3569460B1 (en) 2018-04-11 2024-03-20 Hyundai Motor Company Apparatus and method for controlling driving in vehicle
US10843710B2 (en) 2018-04-11 2020-11-24 Hyundai Motor Company Apparatus and method for providing notification of control authority transition in vehicle
US11351989B2 (en) 2018-04-11 2022-06-07 Hyundai Motor Company Vehicle driving controller, system including the same, and method thereof
US11597403B2 (en) 2018-04-11 2023-03-07 Hyundai Motor Company Apparatus for displaying driving state of vehicle, system including the same and method thereof
EP3552913B1 (en) 2018-04-11 2021-08-18 Hyundai Motor Company Apparatus and method for controlling to enable autonomous system in vehicle
US11548509B2 (en) 2018-04-11 2023-01-10 Hyundai Motor Company Apparatus and method for controlling lane change in vehicle
US11084491B2 (en) 2018-04-11 2021-08-10 Hyundai Motor Company Apparatus and method for providing safety strategy in vehicle
US11084490B2 (en) 2018-04-11 2021-08-10 Hyundai Motor Company Apparatus and method for controlling drive of vehicle
US10836394B2 (en) * 2018-04-11 2020-11-17 Hyundai Motor Company Apparatus and method for lane change control
EP3552901A3 (en) 2018-04-11 2020-04-29 Hyundai Motor Company Apparatus and method for providing safety strategy in vehicle
US11334067B2 (en) 2018-04-11 2022-05-17 Hyundai Motor Company Apparatus and method for providing safety strategy in vehicle
US11173910B2 (en) 2018-04-11 2021-11-16 Hyundai Motor Company Lane change controller for vehicle, system including the same, and method thereof
US10678249B2 (en) 2018-04-20 2020-06-09 Honda Motor Co., Ltd. System and method for controlling a vehicle at an uncontrolled intersection with curb detection
US11572099B2 (en) 2018-04-27 2023-02-07 Honda Motor Co., Ltd. Merge behavior systems and methods for merging vehicles
US11117584B2 (en) 2018-04-27 2021-09-14 Honda Motor Co., Ltd. Merge behavior systems and methods for mainline vehicles
CN108875603B (en) * 2018-05-31 2021-06-04 上海商汤智能科技有限公司 Intelligent driving control method and device based on lane line and electronic equipment
FR3082162B1 (en) * 2018-06-11 2020-06-05 Renault S.A.S Method and device for tuning a closed loop of an advanced driver assistance device
JP7172172B2 (en) * 2018-06-22 2022-11-16 株式会社デンソー vehicle controller
US10778901B2 (en) 2018-06-27 2020-09-15 Aptiv Technologies Limited Camera adjustment system
DE102018005261A1 (en) * 2018-07-02 2020-01-02 Daimler Ag Method and assistance system for operating an autonomous driving operation and vehicle
CN112533809A (en) * 2018-08-06 2021-03-19 日产自动车株式会社 Vehicle control method and vehicle control device
JP6628843B1 (en) * 2018-09-05 2020-01-15 三菱電機株式会社 Obstacle avoidance device and obstacle avoidance route generation device
DE102018216364B4 (en) * 2018-09-25 2020-07-09 Volkswagen Aktiengesellschaft Method and device for supporting a lane change process for a vehicle
JP7229710B2 (en) * 2018-09-26 2023-02-28 本田技研工業株式会社 VEHICLE CONTROL DEVICE, VEHICLE CONTROL METHOD, AND PROGRAM
US10300851B1 (en) * 2018-10-04 2019-05-28 StradVision, Inc. Method for warning vehicle of risk of lane change and alarm device using the same
US11120277B2 (en) * 2018-10-10 2021-09-14 Denso Corporation Apparatus and method for recognizing road shapes
KR102572784B1 (en) * 2018-10-25 2023-09-01 주식회사 에이치엘클레무브 Driver assistance system and control method for the same
US11827241B2 (en) 2018-10-29 2023-11-28 Motional Ad Llc Adjusting lateral clearance for a vehicle using a multi-dimensional envelope
CN109543252B (en) * 2018-11-05 2022-11-22 中国航空工业集团公司西安飞机设计研究所 System safety evaluation method based on bird collision
CN111144179A (en) * 2018-11-06 2020-05-12 富士通株式会社 Scene detection device and method
JP7163729B2 (en) * 2018-11-08 2022-11-01 トヨタ自動車株式会社 vehicle controller
JP2020075665A (en) * 2018-11-09 2020-05-21 トヨタ自動車株式会社 Vehicle travelling control device
US10752242B2 (en) * 2018-11-19 2020-08-25 GM Global Technology Operations LLC System and method for control of an autonomous vehicle
DE102018220127A1 (en) * 2018-11-23 2020-05-28 Ford Global Technologies, Llc Lane change assistance system
JP7155975B2 (en) * 2018-12-10 2022-10-19 トヨタ自動車株式会社 vehicle controller
DE102018132464B4 (en) * 2018-12-17 2020-07-30 Bayerische Motoren Werke Aktiengesellschaft Parking assistance system for carrying out automated maneuvers of various maneuver types supported by the system with a user interface
DE102018221860A1 (en) * 2018-12-17 2020-07-02 Volkswagen Aktiengesellschaft Procedure and assistance system for preparing and / or performing a lane change
US10916134B2 (en) * 2018-12-20 2021-02-09 Denso International America, Inc. Systems and methods for responding to a vehicle parked on shoulder of the road
US10816635B1 (en) 2018-12-20 2020-10-27 Autonomous Roadway Intelligence, Llc Autonomous vehicle localization system
US10820349B2 (en) 2018-12-20 2020-10-27 Autonomous Roadway Intelligence, Llc Wireless message collision avoidance with high throughput
US11276304B2 (en) 2018-12-20 2022-03-15 Denso International America, Inc. Systems and methods for addressing a moving vehicle response to a stationary vehicle
JP7147546B2 (en) * 2018-12-25 2022-10-05 トヨタ自動車株式会社 Slip angle estimator for vehicle
JP7256982B2 (en) 2018-12-28 2023-04-13 スズキ株式会社 Vehicle travel control device
US11433917B2 (en) * 2018-12-28 2022-09-06 Continental Autonomous Mobility US, LLC System and method of human interface for recommended path
US11022458B2 (en) * 2019-01-04 2021-06-01 Telenav, Inc. Navigation system with roadway lane guidance mechanism and method of operation thereof
US11505181B2 (en) 2019-01-04 2022-11-22 Toyota Motor Engineering & Manufacturing North America, Inc. System, method, and computer-readable storage medium for vehicle collision avoidance on the highway
DK180407B1 (en) * 2019-01-28 2021-04-21 Motional Ad Llc Detecting road anomalies
JP7188212B2 (en) * 2019-03-22 2022-12-13 トヨタ自動車株式会社 Vehicle running control device
CN111835998B (en) * 2019-04-13 2023-06-13 长沙智能驾驶研究院有限公司 Beyond-the-horizon panoramic image acquisition method, device, medium, equipment and system
DE102019205892B4 (en) * 2019-04-25 2022-12-29 Volkswagen Aktiengesellschaft Method for operating a motor vehicle and motor vehicle designed to carry out such a method
KR20200130888A (en) * 2019-05-07 2020-11-23 현대모비스 주식회사 Method for controlling scc system based on complex information and apparatus for the same
JP7180536B2 (en) * 2019-05-24 2022-11-30 トヨタ自動車株式会社 vehicle
US10820182B1 (en) 2019-06-13 2020-10-27 David E. Newman Wireless protocols for emergency message transmission
US10939471B2 (en) 2019-06-13 2021-03-02 David E. Newman Managed transmission of wireless DAT messages
US10713950B1 (en) 2019-06-13 2020-07-14 Autonomous Roadway Intelligence, Llc Rapid wireless communication for vehicle collision mitigation
US11755028B2 (en) 2019-09-11 2023-09-12 Deere & Company Mobile work machine with object detection using vision recognition
US11814816B2 (en) 2019-09-11 2023-11-14 Deere & Company Mobile work machine with object detection and machine path visualization
JP7393730B2 (en) 2019-09-26 2023-12-07 スズキ株式会社 Vehicle travel control device
DE102019215026A1 (en) * 2019-09-30 2021-04-01 Robert Bosch Gmbh Method and device for determining a highly accurate estimate of a yaw rate for controlling a vehicle
US11548505B2 (en) 2019-12-09 2023-01-10 Magna Electronics Inc. Vehicular speed control system with automatic setting parameters
US11561543B2 (en) * 2019-12-11 2023-01-24 Baidu Usa Llc Speed planning using a speed planning guideline for idle speed of autonomous driving vehicles
JP7309594B2 (en) * 2019-12-18 2023-07-18 Kddi株式会社 Merging Support Information Delivery Device, Merging Support System, Merging Support Information Delivery Method, and Computer Program
CN111028491A (en) * 2019-12-27 2020-04-17 苏州欧孚网络科技股份有限公司 System and method for monitoring safe food delivery of rider
CN113138594B (en) * 2020-01-20 2024-04-19 北京四维图新科技股份有限公司 Automatic driving method and device
US11599117B2 (en) * 2020-02-20 2023-03-07 Steering Solutions Ip Holding Corporation Systems and methods for obstacle proximity detection
US11472416B2 (en) * 2020-04-30 2022-10-18 Deere & Company Multi-dimensional mobile machine path visualization and control system
CN111580522A (en) * 2020-05-15 2020-08-25 东风柳州汽车有限公司 Control method for unmanned vehicle, and storage medium
KR20210149374A (en) * 2020-06-02 2021-12-09 현대자동차주식회사 Apparatus and method for providing braking level of forward vehicle
US11030893B1 (en) * 2020-06-05 2021-06-08 Samuel Messinger System for reducing speed of a vehicle and method thereof
US11964691B2 (en) 2020-08-17 2024-04-23 Magna Electronics Inc. Vehicular control system with autonomous braking
JP2022037421A (en) * 2020-08-25 2022-03-09 株式会社Subaru Vehicle travel control device
US11206092B1 (en) 2020-11-13 2021-12-21 Ultralogic 5G, Llc Artificial intelligence for predicting 5G network performance
EP4001039A1 (en) * 2020-11-17 2022-05-25 Toyota Jidosha Kabushiki Kaisha Vehicle adaptive cruise control system and method; computer program and computer readable medium for implementing the method
US20220179410A1 (en) * 2020-12-04 2022-06-09 Ford Global Technologies, Llc Systems And Methods For Eliminating Vehicle Motion Interference During A Remote-Control Vehicle Maneuvering Operation
US11212831B1 (en) 2020-12-04 2021-12-28 Ultralogic 5G, Llc Rapid uplink access by modulation of 5G scheduling requests
US11760379B2 (en) * 2021-01-11 2023-09-19 Toyota Research Institute, Inc. Navigating an autonomous vehicle through an intersection
US11904906B2 (en) * 2021-08-05 2024-02-20 Argo AI, LLC Systems and methods for prediction of a jaywalker trajectory through an intersection
KR20230055722A (en) * 2021-10-19 2023-04-26 현대모비스 주식회사 A target detection system and method of a vehicle
US11987237B2 (en) * 2021-12-20 2024-05-21 Waymo Llc Systems and methods to determine a lane change strategy at a merge region

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070030131A1 (en) * 2005-08-02 2007-02-08 Nissan Motor Co., Ltd. Vehicle obstacle verification system
US20080162027A1 (en) * 2006-12-29 2008-07-03 Robotic Research, Llc Robotic driving system
US20130151058A1 (en) * 2011-12-09 2013-06-13 GM Global Technology Operations LLC Method and system for controlling a host vehicle
US20140018995A1 (en) * 2012-04-27 2014-01-16 Google Inc. Safely Navigating on Roads Through Maintaining Safe Distance from Other Vehicles

Family Cites Families (316)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5170374A (en) 1981-05-13 1992-12-08 Hitachi, Ltd. Semiconductor memory
US6735506B2 (en) 1992-05-05 2004-05-11 Automotive Technologies International, Inc. Telematics system
US5845000A (en) 1992-05-05 1998-12-01 Automotive Technologies International, Inc. Optical identification and monitoring system using pattern recognition for use with vehicles
US6442465B2 (en) 1992-05-05 2002-08-27 Automotive Technologies International, Inc. Vehicular component control systems and methods
US4720790A (en) 1984-05-21 1988-01-19 Kabushiki Kaisha Toyota Chuo Kenkyusho Apparatus for controlling steer angle of rear wheels of vehicle
US5001558A (en) 1985-06-11 1991-03-19 General Motors Corporation Night vision system with color video camera
US5835255A (en) 1986-04-23 1998-11-10 Etalon, Inc. Visible spectrum modulator arrays
US5064274A (en) 1987-08-26 1991-11-12 Siegel-Robert, Inc. Automatic automobile rear view mirror assembly
JPH01173825A (en) 1987-12-28 1989-07-10 Aisin Aw Co Ltd Navigation device for vehicle
US4991054A (en) 1988-05-13 1991-02-05 Pacific Scientific Company Time-delay outdoor lighting control systems
US5003288A (en) 1988-10-25 1991-03-26 Nartron Corporation Ambient light sensing method and apparatus
US5614885A (en) 1988-12-05 1997-03-25 Prince Corporation Electrical control system for vehicle options
FR2642855B1 (en) 1989-02-06 1991-05-17 Essilor Int Optical lens for the correction of astigmatism
JPH0749925B2 (en) 1989-03-01 1995-05-31 浜松ホトニクス株式会社 Two-dimensional incident position detector
JPH02308575A (en) 1989-05-24 1990-12-21 Nissan Motor Co Ltd Photodetector cell
US5097362A (en) 1989-07-19 1992-03-17 Lynas Robert M Rearview mirror targeting and repositioning system
US5027001A (en) 1989-08-29 1991-06-25 Torbert William F Moisture sensitive automatic windshield wiper and headlight control device
US4987357A (en) 1989-12-18 1991-01-22 General Motors Corporation Adaptive motor vehicle cruise control
US5059877A (en) 1989-12-22 1991-10-22 Libbey-Owens-Ford Co. Rain responsive windshield wiper control
JP2843079B2 (en) 1989-12-22 1999-01-06 本田技研工業株式会社 Driving path determination method
US5044706A (en) 1990-02-06 1991-09-03 Hughes Aircraft Company Optical element employing aspherical and binary grating optical surfaces
FR2658642B1 (en) 1990-02-20 1994-06-10 Rousseau Codes Method and device for teaching the driving of land vehicles
US5303205A (en) 1990-02-26 1994-04-12 Trend Tec Inc. Vehicular distance measuring system with integral mirror display
US5072154A (en) 1990-03-13 1991-12-10 Chen Min Hsiung Automatic luminosity control device for car and motor bicycle headlamps
JP2920653B2 (en) 1990-03-15 1999-07-19 アイシン精機株式会社 In-vehicle imaging device
DE4111993B4 (en) 1990-04-23 2005-05-25 Volkswagen Ag Camera for an image processing system
US5121200A (en) 1990-07-06 1992-06-09 Choi Seung Lyul Travelling monitoring system for motor vehicles
US5027200A (en) 1990-07-10 1991-06-25 Edward Petrossian Enhanced viewing at side and rear of motor vehicles
US5177685A (en) 1990-08-09 1993-01-05 Massachusetts Institute Of Technology Automobile navigation system using real time spoken driving instructions
US5086253A (en) 1990-10-15 1992-02-04 Lawler Louis N Automatic headlight dimmer apparatus
US5124549A (en) 1990-10-15 1992-06-23 Lectron Products, Inc. Automatic headlamp dimmer with optical baffle
US5446576A (en) 1990-11-26 1995-08-29 Donnelly Corporation Electrochromic mirror for vehicles with illumination and heating control
US5309137A (en) 1991-02-26 1994-05-03 Mitsubishi Denki Kabushiki Kaisha Motor car traveling control device
US5451822A (en) 1991-03-15 1995-09-19 Gentex Corporation Electronic control system
KR930001987Y1 (en) 1991-03-28 1993-04-19 홍선택 Rear-view mirror adjusting device
US5414257A (en) 1991-04-23 1995-05-09 Introlab Pty Limited Moisture sensor for detecting moisture on a windshield
US5182502A (en) 1991-05-06 1993-01-26 Lectron Products, Inc. Automatic headlamp dimmer
US5245422A (en) 1991-06-28 1993-09-14 Zexel Corporation System and method for automatically steering a vehicle within a lane in a road
JP2782990B2 (en) 1991-07-11 1998-08-06 日産自動車株式会社 Vehicle approach determination device
US5469298A (en) 1991-08-14 1995-11-21 Prince Corporation Reflective display at infinity
JPH0554276A (en) 1991-08-23 1993-03-05 Matsushita Electric Ind Co Ltd Obstacle detection device
US5193000A (en) 1991-08-28 1993-03-09 Stereographics Corporation Multiplexing technique for stereoscopic video system
US5416318A (en) 1991-10-03 1995-05-16 Hegyi; Dennis J. Combined headlamp and climate control sensor having a light diffuser and a light modulator
FR2682792B1 (en) 1991-10-16 1995-10-20 Ii Bc Sys Device for avoiding chain-reaction pile-up collisions
JP3167752B2 (en) 1991-10-22 2001-05-21 富士重工業株式会社 Vehicle distance detection device
US5535314A (en) 1991-11-04 1996-07-09 Hughes Aircraft Company Video image processor and method for detecting vehicles
JP3031013B2 (en) 1991-11-15 2000-04-10 日産自動車株式会社 Visual information providing device
US5193029A (en) 1991-11-19 1993-03-09 Donnelly Corporation Single sensor adaptive drive circuit for rearview mirror system
US5336980A (en) 1992-12-10 1994-08-09 Leopold Kostal Gmbh & Co. Apparatus and method for controlling a windshield wiping system
US5276388A (en) 1991-12-14 1994-01-04 Leopold Kostal Gmbh & Co. Kg Apparatus and method for controlling a windshield wiping system
US5644851A (en) 1991-12-20 1997-07-08 Blank; Rodney K. Compensation system for electronic compass
US5255442A (en) 1991-12-20 1993-10-26 Donnelly Corporation Vehicle compass with electronic sensor
US5394333A (en) 1991-12-23 1995-02-28 Zexel Usa Corp. Correcting GPS position in a hybrid navigation system
US5208701A (en) 1991-12-24 1993-05-04 Xerox Corporation Wobble correction lens with binary diffractive optic surface and refractive cylindrical surface
US5461357A (en) 1992-01-29 1995-10-24 Mazda Motor Corporation Obstacle detection device for vehicle
US5168378A (en) 1992-02-10 1992-12-01 Reliant Laser Corporation Mirror with dazzle light attenuation zone
JP2800531B2 (en) 1992-02-28 1998-09-21 三菱電機株式会社 Obstacle detection device for vehicles
JP2973695B2 (en) 1992-03-12 1999-11-08 船井電機株式会社 In-vehicle navigation system
JPH05265547A (en) 1992-03-23 1993-10-15 Fuji Heavy Ind Ltd On-vehicle outside monitoring device
US5204778A (en) 1992-04-06 1993-04-20 Gentex Corporation Control system for automatic rearview mirrors
US5305012A (en) 1992-04-15 1994-04-19 Reveo, Inc. Intelligent electro-optical system and method for automatic glare reduction
EP0567660B2 (en) 1992-04-21 2000-09-06 IBP Pietzsch GmbH Device for the guiding of vehicles
US5325386A (en) 1992-04-21 1994-06-28 Bandgap Technology Corporation Vertical-cavity surface emitting laser assay display system
GB2267341B (en) 1992-05-27 1996-02-21 Koito Mfg Co Ltd Glare sensor for a vehicle
US5515448A (en) 1992-07-28 1996-05-07 Yazaki Corporation Distance measuring apparatus of a target tracking type
JPH0785280B2 (en) 1992-08-04 1995-09-13 タカタ株式会社 Collision prediction judgment system by neural network
US5351044A (en) 1992-08-12 1994-09-27 Rockwell International Corporation Vehicle lane position detection system
KR100267026B1 (en) 1992-08-14 2000-09-15 존 리 올즈 Recording of operational events in an automotive vehicle
BR9306886A (en) 1992-08-14 1998-12-08 Vorad Safety Systems Inc System to detect obstacles to vehicles
JP2783079B2 (en) 1992-08-28 1998-08-06 トヨタ自動車株式会社 Light distribution control device for headlamp
US5448319A (en) 1992-09-22 1995-09-05 Olympus Optical Co., Ltd. Optical system for monitor cameras to be mounted on vehicles
DE4332612C2 (en) 1992-09-25 1996-02-22 Yazaki Corp Exterior view monitoring method for motor vehicles
JP3462227B2 (en) 1992-11-13 2003-11-05 矢崎総業株式会社 Display device for vehicles
EP0631167B1 (en) 1992-12-14 2005-02-16 Denso Corporation Image display
US5285060A (en) 1992-12-15 1994-02-08 Donnelly Corporation Display for automatic rearview mirror
US5497306A (en) 1993-02-01 1996-03-05 Donnelly Corporation Exterior vehicle security light
JP3263699B2 (en) 1992-12-22 2002-03-04 三菱電機株式会社 Driving environment monitoring device
EP0605045B1 (en) 1992-12-29 1999-03-31 Laboratoires D'electronique Philips S.A.S. Image processing method and apparatus for generating one image from adjacent images
JPH06213660A (en) 1993-01-19 1994-08-05 Aisin Seiki Co Ltd Detecting method for approximate straight line of image
US5529138A (en) 1993-01-22 1996-06-25 Shaw; David C. H. Vehicle collision avoidance system
US5289321A (en) 1993-02-12 1994-02-22 Secor James O Consolidated rear view camera and display system for motor vehicle
US5313072A (en) 1993-02-16 1994-05-17 Rockwell International Corporation Optical detector for windshield wiper control
US5877897A (en) 1993-02-26 1999-03-02 Donnelly Corporation Automatic rearview mirror, vehicle lighting control and vehicle interior monitoring system using a photosensor array
US5670935A (en) 1993-02-26 1997-09-23 Donnelly Corporation Rearview vision system for vehicle including panoramic view
US6498620B2 (en) 1993-02-26 2002-12-24 Donnelly Corporation Vision system for a vehicle including an image capture device and a display system having a long focal length
US6396397B1 (en) 1993-02-26 2002-05-28 Donnelly Corporation Vehicle imaging system with stereo imaging
US5550677A (en) 1993-02-26 1996-08-27 Donnelly Corporation Automatic rearview mirror system using a photosensor array
US5796094A (en) 1993-02-26 1998-08-18 Donnelly Corporation Vehicle headlight control using imaging sensor
US6674562B1 (en) 1994-05-05 2004-01-06 Iridigm Display Corporation Interferometric modulation of radiation
JP3468428B2 (en) 1993-03-24 2003-11-17 富士重工業株式会社 Vehicle distance detection device
JP2887039B2 (en) 1993-03-26 1999-04-26 三菱電機株式会社 Vehicle periphery monitoring device
DE4408745C2 (en) 1993-03-26 1997-02-27 Honda Motor Co Ltd Driving control device for vehicles
WO1994022693A1 (en) 1993-03-31 1994-10-13 Automotive Technologies International, Inc. Vehicle occupant position and velocity sensor
US6430303B1 (en) 1993-03-31 2002-08-06 Fujitsu Limited Image processing apparatus
DE4313568C1 (en) * 1993-04-26 1994-06-16 Daimler Benz Ag Guiding motor vehicle driver when changing traffic lanes - using radar devices to detect velocity and spacing of vehicles in next lane and indicate when lane changing is possible
US6084519A (en) 1993-05-07 2000-07-04 Control Devices, Inc. Multi-function light sensor for vehicle
DE4318114C2 (en) 1993-06-01 1998-07-16 Kostal Leopold Gmbh & Co Kg Sensor device
US6553130B1 (en) 1993-08-11 2003-04-22 Jerome H. Lemelson Motor vehicle warning and control system and method
US5434407A (en) 1993-08-23 1995-07-18 Gentex Corporation Automatic rearview mirror incorporating light pipe
GB9317983D0 (en) 1993-08-28 1993-10-13 Lucas Ind Plc A driver assistance system for a vehicle
US5586063A (en) 1993-09-01 1996-12-17 Hardin; Larry C. Optical range and speed detection system
US5638116A (en) 1993-09-08 1997-06-10 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
US5457493A (en) 1993-09-15 1995-10-10 Texas Instruments Incorporated Digital micro-mirror based image simulation system
US5374852A (en) 1993-09-17 1994-12-20 Parkes; Walter B. Motor vehicle headlight activation apparatus for inclement weather conditions
US5440428A (en) 1993-09-30 1995-08-08 Hughes Aircraft Company Automotive instrument 3-D virtual image display
US5883739A (en) 1993-10-04 1999-03-16 Honda Giken Kogyo Kabushiki Kaisha Information display device for vehicle
US5406395A (en) 1993-11-01 1995-04-11 Hughes Aircraft Company Holographic parking assistance device
JP3522317B2 (en) 1993-12-27 2004-04-26 富士重工業株式会社 Travel guide device for vehicles
US5430431A (en) 1994-01-19 1995-07-04 Nelson; Louis J. Vehicle protection system and method
US5471515A (en) 1994-01-28 1995-11-28 California Institute Of Technology Active pixel sensor with intra-pixel charge transfer
US5461361A (en) 1994-03-11 1995-10-24 Chrysler Corporation Automotive instrument panel apparatus
JP3358099B2 (en) 1994-03-25 2002-12-16 オムロン株式会社 Optical sensor device
US5619370A (en) 1994-03-28 1997-04-08 Guinosso; Patrick J. Optical system for viewing a remote location
US5666028A (en) 1994-04-06 1997-09-09 Gentex Corporation Automobile headlamp and running light control system
US5537003A (en) 1994-04-08 1996-07-16 Gentex Corporation Control system for automotive vehicle headlamps and other vehicle equipment
FR2718874B1 (en) 1994-04-15 1996-05-15 Thomson Csf Traffic monitoring method for automatic detection of vehicle incidents.
US6680792B2 (en) 1994-05-05 2004-01-20 Iridigm Display Corporation Interferometric modulation of radiation
US7123216B1 (en) 1994-05-05 2006-10-17 Idc, Llc Photonic MEMS and structures
US6710908B2 (en) 1994-05-05 2004-03-23 Iridigm Display Corporation Controlling micro-electro-mechanical cavities
US5963247A (en) 1994-05-31 1999-10-05 Banitt; Shmuel Visual display systems and a system for producing recordings for visualization thereon and methods therefor
ES1028357Y (en) 1994-06-03 1995-06-16 Cortes Luis Leon Lamata Receiving device for rear-view screen
US5574443A (en) 1994-06-22 1996-11-12 Hsieh; Chi-Sheng Vehicle monitoring apparatus with broadly and reliably rearward viewing
JP3287117B2 (en) 1994-07-05 2002-05-27 株式会社日立製作所 Environment recognition device for vehicles using imaging device
JP3357749B2 (en) 1994-07-12 2002-12-16 Honda Giken Kogyo Kabushiki Kaisha Vehicle road image processing device
US5594222A (en) 1994-10-25 1997-01-14 Integrated Controls Touch sensor and control circuit therefor
US5793420A (en) 1994-10-28 1998-08-11 Schmidt; William P. Video recording system for vehicle
US5732379A (en) 1994-11-25 1998-03-24 Itt Automotive Europe Gmbh Brake system for a motor vehicle with yaw moment control
US5677851A (en) 1994-12-15 1997-10-14 Novell, Inc. Method and apparatus to secure digital directory object changes
JPH08175263A (en) 1994-12-27 1996-07-09 Murakami Kaimeidou KK Interior mirror with built-in display device
US5614788A (en) 1995-01-31 1997-03-25 Autosmart Light Switches, Inc. Automated ambient condition responsive daytime running light system
US5528698A (en) 1995-03-27 1996-06-18 Rockwell International Corporation Automotive occupant sensing device
JP2885125B2 (en) 1995-03-30 1999-04-19 Toyota Motor Corporation Estimation method of motion state quantity changing with turning of vehicle
JP3539788B2 (en) 1995-04-21 2004-07-07 Panasonic Mobile Communications Co., Ltd. Image matching method
US5500766A (en) 1995-05-04 1996-03-19 Stonecypher; Bob Blind spot side mirror
US5568027A (en) 1995-05-19 1996-10-22 Libbey-Owens-Ford Co. Smooth rain-responsive wiper control
US5737226A (en) 1995-06-05 1998-04-07 Prince Corporation Vehicle compass system with automatic calibration
US7085637B2 (en) 1997-10-22 2006-08-01 Intelligent Technologies International, Inc. Method and system for controlling a vehicle
US7202776B2 (en) 1997-10-22 2007-04-10 Intelligent Technologies International, Inc. Method and system for detecting objects external to a vehicle
US5915800A (en) 1995-06-19 1999-06-29 Fuji Jukogyo Kabushiki Kaisha System for controlling braking of an automotive vehicle
JP3546600B2 (en) 1995-09-07 2004-07-28 Toyota Motor Corporation Light distribution control device for headlamp
US5724316A (en) 1995-09-26 1998-03-03 Delco Electronics Corporation GPS based time determining system and method
US5878370A (en) 1995-12-01 1999-03-02 Prince Corporation Vehicle compass system with variable resolution
US5790973A (en) 1995-12-19 1998-08-04 Prince Corporation Last exit warning system
DE69632384T2 (en) 1995-12-19 2005-05-04 Canon K.K. Apparatus and method for controlling a plurality of remote cameras
US5761094A (en) 1996-01-18 1998-06-02 Prince Corporation Vehicle compass system
US5786772A (en) 1996-03-22 1998-07-28 Donnelly Corporation Vehicle blind spot detection display system
WO1997038350A1 (en) 1996-04-10 1997-10-16 Donnelly Corporation Electrochromic devices
US5661303A (en) 1996-05-24 1997-08-26 Libbey-Owens-Ford Co. Compact moisture sensor with collimator lenses and prismatic coupler
US6550949B1 (en) 1996-06-13 2003-04-22 Gentex Corporation Systems and components for enhancing rear vision from a vehicle
DE19624046A1 (en) 1996-06-17 1997-12-18 Bayerische Motoren Werke Ag Method and device for indicating the braking strength or deceleration in a vehicle
JP3805832B2 (en) 1996-07-10 2006-08-09 Fuji Jukogyo Kabushiki Kaisha Vehicle driving support device
JPH1059068A (en) 1996-08-23 1998-03-03 Yoshihisa Furuta Dead angle confirmation device for vehicle
US5878357A (en) 1996-09-03 1999-03-02 Ford Global Technologies, Inc. Method and apparatus for vehicle yaw rate estimation
US5924212A (en) 1996-10-09 1999-07-20 Donnelly Corporation Electronic compass
JPH10161013A (en) 1996-12-05 1998-06-19 Canon Inc Environment recognition device and camera provided therewith
JP4162717B2 (en) 1996-12-10 2008-10-08 Touch Sensor Technologies, LLC Differential touch sensor and control circuit thereof
US5877707A (en) 1997-01-17 1999-03-02 Kowalick; Thomas M. GPS based seat belt monitoring system & method for using same
US5844505A (en) 1997-04-01 1998-12-01 Sony Corporation Automobile navigation system
US6587573B1 (en) 2000-03-20 2003-07-01 Gentex Corporation System for controlling exterior vehicle lights
US6631316B2 (en) 2001-03-05 2003-10-07 Gentex Corporation Image processing system to control vehicle headlamps or other vehicle equipment
US6611610B1 (en) 1997-04-02 2003-08-26 Gentex Corporation Vehicle lamp control
US6049171A (en) 1998-09-18 2000-04-11 Gentex Corporation Continuously variable headlamp control
US5837994C1 (en) 1997-04-02 2001-10-16 Gentex Corp Control system to automatically dim vehicle head lamps
US5923027A (en) 1997-09-16 1999-07-13 Gentex Corporation Moisture sensor and windshield fog detector using an image sensor
US5990469A (en) 1997-04-02 1999-11-23 Gentex Corporation Control circuit for image array sensors
JP3508909B2 (en) 1997-07-01 2004-03-22 Murakami Kaimeidou KK Rearview mirror quick deflection controller
US6250148B1 (en) 1998-01-07 2001-06-26 Donnelly Corporation Rain sensor mount for use in a vehicle
US6313454B1 (en) 1999-07-02 2001-11-06 Donnelly Corporation Rain sensor
EP1025702B9 (en) 1997-10-30 2007-10-03 Donnelly Corporation Rain sensor with fog discrimination
US6020704A (en) 1997-12-02 2000-02-01 Valeo Electrical Systems, Inc. Windscreen sensing and wiper control system
US6124647A (en) 1998-12-16 2000-09-26 Donnelly Corporation Information display in a rearview mirror
US6294989B1 (en) 1998-12-16 2001-09-25 Donnelly Corporation Tire inflation assistance monitoring system
DE19812237C1 (en) 1998-03-20 1999-09-23 Daimler Chrysler Ag Method for driving dynamics control on a road vehicle
US5899956A (en) 1998-03-31 1999-05-04 Advanced Future Technologies, Inc. Vehicle mounted navigation device
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6158655A (en) 1998-04-08 2000-12-12 Donnelly Corporation Vehicle mounted remote transaction interface system
US6420975B1 (en) 1999-08-25 2002-07-16 Donnelly Corporation Interior rearview mirror sound processing system
JPH11331822A (en) 1998-05-15 1999-11-30 Matsushita Electric Ind Co Ltd Monitor camera system
US6175300B1 (en) 1998-09-03 2001-01-16 Byron K. Kendrick Blind spot viewing system
US6066933A (en) 1998-10-02 2000-05-23 Ponziana; Richard L. Rain sensing system and method having automatically registered and oriented rain sensor
US6266442B1 (en) 1998-10-23 2001-07-24 Facet Technology Corp. Method and apparatus for identifying objects depicted in a videostream
US6201642B1 (en) 1999-07-27 2001-03-13 Donnelly Corporation Vehicular vision system with a wide angle lens including a diffractive element
US6717610B1 (en) 1998-11-25 2004-04-06 Donnelly Corporation Wide angle image capture system for vehicle
US6320282B1 (en) 1999-01-19 2001-11-20 Touchsensor Technologies, Llc Touch switch with integral control circuit
DE19902081A1 (en) 1999-01-20 2000-07-27 Zeiss Carl Fa Stabilized camera
US6166698A (en) 1999-02-16 2000-12-26 Gentex Corporation Rearview mirror with integrated microwave receiver
US6144022A (en) 1999-03-15 2000-11-07 Valeo Electrical Systems, Inc. Rain sensor using statistical analysis
US6333759B1 (en) 1999-03-16 2001-12-25 Joseph J. Mazzilli 360° automobile video camera system
US6392315B1 (en) 1999-04-05 2002-05-21 Delphi Technologies, Inc. Compensation circuit for an automotive ignition sensing system
CN100438623C (en) 1999-04-16 2008-11-26 Matsushita Electric Industrial Co., Ltd. Image processing device and monitoring system
US6560527B1 (en) * 1999-10-18 2003-05-06 Ford Global Technologies, Inc. Speed control method
US6757109B2 (en) 1999-07-27 2004-06-29 Donnelly Corporation Plastic lens system for vehicle imaging system
US6515781B2 (en) 1999-08-05 2003-02-04 Microvision, Inc. Scanned imaging apparatus with switched feeds
US6795221B1 (en) 1999-08-05 2004-09-21 Microvision, Inc. Scanned display with switched feeds and distortion correction
US6433907B1 (en) 1999-08-05 2002-08-13 Microvision, Inc. Scanned display with plurality of scanning assemblies
US6227689B1 (en) 1999-09-28 2001-05-08 Donnelly Corporation Illumination device for exterior mirror
US6411204B1 (en) 1999-11-15 2002-06-25 Donnelly Corporation Deceleration based anti-collision safety light control for vehicle
US6704621B1 (en) 1999-11-26 2004-03-09 Gideon P. Stein System and method for estimating ego-motion of a moving vehicle using successive images recorded along the vehicle's path of motion
SE520360C2 (en) 1999-12-15 2003-07-01 Goeran Sjoenell Warning device for vehicles
US6526335B1 (en) 2000-01-24 2003-02-25 G. Victor Treyz Automobile personal computer systems
JP2001213254A (en) 2000-01-31 2001-08-07 Yazaki Corp Side monitoring device for vehicle
WO2007053710A2 (en) 2005-11-01 2007-05-10 Donnelly Corporation Interior rearview mirror with display
US7167796B2 (en) 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
US7480149B2 (en) 2004-08-18 2009-01-20 Donnelly Corporation Accessory module for vehicle
WO2001064481A2 (en) 2000-03-02 2001-09-07 Donnelly Corporation Video mirror systems incorporating an accessory module
US7370983B2 (en) 2000-03-02 2008-05-13 Donnelly Corporation Interior mirror assembly with display
US7224324B2 (en) 2000-03-27 2007-05-29 Donnelly Corporation Interactive automotive rearvision system
KR100373002B1 (en) 2000-04-03 2003-02-25 Hyundai Motor Company Method for judging lane departure of a vehicle
US7365769B1 (en) 2000-07-06 2008-04-29 Donald Mager Activating a vehicle's own brake lights and/or brakes when brake lights are sensed in front of the vehicle, including responsively to the proximity of, and/or rate of closure with, a forward vehicle
GB2365142B (en) 2000-07-27 2002-06-19 Michael John Downs Jamin-type interferometers and components therefor
JP3521860B2 (en) 2000-10-02 2004-04-26 Nissan Motor Co., Ltd. Vehicle travel path recognition device
US7062300B1 (en) 2000-11-09 2006-06-13 Ki Il Kim Cellular phone holder with charger mounted to vehicle dashboard
US6672731B2 (en) 2000-11-20 2004-01-06 Donnelly Corporation Vehicular rearview mirror with blind spot viewing system
AU2002251807A1 (en) 2001-01-23 2002-08-19 Donnelly Corporation Improved vehicular lighting system for a mirror assembly
US7581859B2 (en) 2005-09-14 2009-09-01 Donnelly Corp. Display device for exterior rearview mirror
US6890041B1 (en) * 2001-02-06 2005-05-10 William B. Ribbens Antilock brake systems employing a sliding mode observer based estimation of differential wheel torque
US20020113873A1 (en) 2001-02-20 2002-08-22 Williams Michael R. Rear vision system for large vehicles
JP4140202B2 (en) 2001-02-28 2008-08-27 Mitsubishi Electric Corporation Moving object detection device
US6424273B1 (en) 2001-03-30 2002-07-23 Koninklijke Philips Electronics N.V. System to aid a driver to determine whether to change lanes
DE10118265A1 (en) 2001-04-12 2002-10-17 Bosch Gmbh Robert Detecting vehicle lane change, involves forming track change indicating signal by comparing measured angular rate of preceding vehicle(s) with vehicle's own yaw rate
DE20106977U1 (en) 2001-04-23 2002-08-29 Mekra Lang Gmbh & Co Kg Warning device in motor vehicles
US6539306B2 (en) 2001-06-15 2003-03-25 Gentex Corporation Automotive mirror with integrated Loran components
US6497503B1 (en) 2001-06-21 2002-12-24 Ford Global Technologies, Inc. Headlamp system with selectable beam pattern
US6882287B2 (en) 2001-07-31 2005-04-19 Donnelly Corporation Automotive lane change aid
US6589625B1 (en) 2001-08-01 2003-07-08 Iridigm Display Corporation Hermetic seal and method to create the same
WO2003029046A1 (en) 2001-10-03 2003-04-10 Maryann Winter Apparatus and method for sensing the occupancy status of parking spaces in a parking lot
US6636258B2 (en) 2001-10-19 2003-10-21 Ford Global Technologies, Llc 360° vision system for a vehicle
US6909753B2 (en) 2001-12-05 2005-06-21 Koninklijke Philips Electronics, N.V. Combined MPEG-4 FGS and modulation algorithm for wireless video transmission
US20030137586A1 (en) 2002-01-22 2003-07-24 Infinite Innovations, Inc. Vehicle video switching system and method
WO2003065084A1 (en) 2002-01-31 2003-08-07 Donnelly Corporation Vehicle accessory module
EP1332923B1 (en) 2002-02-05 2007-07-11 Donnelly Hohe GmbH & Co. KG Manoeuvring and/or parking aid device for a vehicle
US6794119B2 (en) 2002-02-12 2004-09-21 Iridigm Display Corporation Method for fabricating a structure for a microelectromechanical systems (MEMS) device
US6574033B1 (en) 2002-02-27 2003-06-03 Iridigm Display Corporation Microelectromechanical systems device and method for fabricating same
US6975775B2 (en) 2002-03-06 2005-12-13 Radiant Imaging, Inc. Stray light correction method for imaging light and color measurement system
US20030222982A1 (en) 2002-03-28 2003-12-04 Hamdan Majil M. Integrated video/data information system and method for application to commercial vehicles to enhance driver awareness
US7145519B2 (en) 2002-04-18 2006-12-05 Nissan Motor Co., Ltd. Image display apparatus, method, and program for automotive vehicle
US7005974B2 (en) 2002-04-19 2006-02-28 Donnelly Corporation Vehicle imaging system
US7004606B2 (en) 2002-04-23 2006-02-28 Donnelly Corporation Automatic headlamp control
US7123168B2 (en) 2002-04-25 2006-10-17 Donnelly Corporation Driving separation distance indicator
US6946978B2 (en) 2002-04-25 2005-09-20 Donnelly Corporation Imaging system for vehicle
ES2391556T3 (en) 2002-05-03 2012-11-27 Donnelly Corporation Object detection system for vehicles
US20060061008A1 (en) 2004-09-14 2006-03-23 Lee Karner Mounting assembly for vehicle interior mirror
US6741377B2 (en) 2002-07-02 2004-05-25 Iridigm Display Corporation Device having a light-absorbing mask and a method for fabricating same
US7360932B2 (en) 2004-06-01 2008-04-22 Donnelly Corporation Mirror assembly for vehicle
DE20214892U1 (en) 2002-09-25 2002-11-21 Hohe Gmbh & Co Kg Monitoring device for a motor vehicle
US7272482B2 (en) * 2002-09-30 2007-09-18 Nissan Motor Co., Ltd. Preceding-vehicle following control system
WO2004047421A2 (en) 2002-11-14 2004-06-03 Donnelly Corporation Imaging system for vehicle
US7136753B2 (en) 2002-12-05 2006-11-14 Denso Corporation Object recognition apparatus for vehicle, inter-vehicle control apparatus, and distance measurement apparatus
US7541743B2 (en) 2002-12-13 2009-06-02 Ford Global Technologies, Llc Adaptive vehicle communication controlled lighting system
JP3925474B2 (en) * 2003-07-18 2007-06-06 Nissan Motor Co., Ltd. Lane change support device
US7249860B2 (en) 2003-09-05 2007-07-31 Donnelly Corporation Interior rearview mirror assembly
DE10346508B4 (en) 2003-10-02 2007-10-11 Daimlerchrysler Ag Device for improving the visibility in a motor vehicle
KR20060120061A (en) 2003-10-28 2006-11-24 Continental Teves AG & Co. oHG Method and system for improving the handling characteristics of a vehicle
US7338177B2 (en) 2003-11-26 2008-03-04 Donnelly Corporation Mirror reflective element for a vehicle
US7526103B2 (en) 2004-04-15 2009-04-28 Donnelly Corporation Imaging system for vehicle
US7227611B2 (en) 2004-08-23 2007-06-05 The Boeing Company Adaptive and interactive scene illumination
US7881496B2 (en) 2004-09-30 2011-02-01 Donnelly Corporation Vision system for vehicle
US20060103727A1 (en) 2004-11-17 2006-05-18 Huan-Chin Tseng Vehicle back up camera
US8256821B2 (en) 2004-12-15 2012-09-04 Magna Donnelly Engineering Gmbh Accessory module system for a vehicle window
US7720580B2 (en) 2004-12-23 2010-05-18 Donnelly Corporation Object detection system for vehicle
US20060164221A1 (en) 2005-01-18 2006-07-27 Jensen John M Sensor-activated controlled safety or warning light mounted on or facing toward rear of vehicle
DE502006007631D1 (en) 2005-02-22 2010-09-23 Adc Automotive Dist Control Method for detecting the activation of brake lights of preceding vehicles
US20060250501A1 (en) 2005-05-06 2006-11-09 Widmann Glenn R Vehicle security monitor system and method
JP2006341839A (en) 2005-05-10 2006-12-21 Aisin Seiki Co Ltd Annunciating device for vehicle and warning device for vehicle
JP2006341641A (en) 2005-06-07 2006-12-21 Nissan Motor Co Ltd Image display apparatus and image display method
JP4580288B2 (en) 2005-06-28 2010-11-10 Honda Giken Kogyo Kabushiki Kaisha Driving assistance vehicle
WO2008051910A2 (en) 2006-10-24 2008-05-02 Donnelly Corporation Display device for exterior mirror
ES2401523T3 (en) 2005-07-06 2013-04-22 Donnelly Corporation Exterior mirror assembly for vehicle equipped with a blind spot indicator
US7460951B2 (en) 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
EP1946540A4 (en) 2005-10-28 2009-09-09 Magna Electronics Inc Camera module for vehicle vision system
JP2007129525A (en) 2005-11-04 2007-05-24 Konica Minolta Photo Imaging Inc Camera system and controller
US7444241B2 (en) * 2005-12-09 2008-10-28 Gm Global Technology Operations, Inc. Method for detecting or predicting vehicle cut-ins
WO2007074591A1 (en) 2005-12-27 2007-07-05 Honda Motor Co., Ltd. Vehicle and steering control device for vehicle
JP4462231B2 (en) 2006-05-09 2010-05-12 Denso Corporation Auto light device for vehicle
US7724962B2 (en) 2006-07-07 2010-05-25 Siemens Corporation Context adaptive approach in vehicle detection under various visibility conditions
US7777611B2 (en) 2006-11-06 2010-08-17 Donnelly Corporation Display device for exterior rearview mirror
EP3624086A1 (en) 2007-01-25 2020-03-18 Magna Electronics Inc. Radar sensing system for vehicle
US7914187B2 (en) 2007-07-12 2011-03-29 Magna Electronics Inc. Automatic lighting system with adaptive alignment function
JP4497231B2 (en) 2007-10-09 2010-07-07 Denso Corporation Vehicle speed control device
TWI372564B (en) 2007-10-30 2012-09-11 Av Tech Corp Video system, image emission apparatus, video receiver apparatus and control method
US8027029B2 (en) 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
US8126643B2 (en) 2007-12-28 2012-02-28 Core Wireless Licensing S.A.R.L. Method, apparatus and computer program product for providing instructions to a destination that is revealed upon arrival
DE102008003194A1 (en) 2008-01-04 2009-07-09 Wabco Gmbh Driver assistance system
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
US20090265069A1 (en) 2008-04-17 2009-10-22 Herman Desbrunes Land vehicle braking system
US20100020170A1 (en) 2008-07-24 2010-01-28 Higgins-Luthman Michael J Vehicle Imaging System
US9487144B2 (en) 2008-10-16 2016-11-08 Magna Mirrors Of America, Inc. Interior mirror assembly with display
US9126525B2 (en) * 2009-02-27 2015-09-08 Magna Electronics Inc. Alert system for vehicle
WO2010144900A1 (en) 2009-06-12 2010-12-16 Magna Electronics Inc. Scalable integrated electronic control unit for vehicle
US9041806B2 (en) 2009-09-01 2015-05-26 Magna Electronics Inc. Imaging and display system for vehicle
US8633810B2 (en) 2009-11-19 2014-01-21 Robert Bosch Gmbh Rear-view multi-functional camera system
US20110157322A1 (en) 2009-12-31 2011-06-30 Broadcom Corporation Controlling a pixel array to support an adaptable light manipulator
JP2011186351A (en) 2010-03-11 2011-09-22 Sony Corp Information processor, information processing method, and program
EP2423063B1 (en) 2010-08-23 2013-03-06 Harman Becker Automotive Systems GmbH Method of detecting the braking of a vehicle
US20120303222A1 (en) 2011-03-23 2012-11-29 Tk Holding Inc. Driver assistance system
WO2012129424A2 (en) 2011-03-23 2012-09-27 Tk Holdings Inc. Driver assistance system
US9194943B2 (en) 2011-04-12 2015-11-24 Magna Electronics Inc. Step filter for estimating distance in a time-of-flight ranging system
US10793067B2 (en) 2011-07-26 2020-10-06 Magna Electronics Inc. Imaging system for vehicle
WO2013043661A1 (en) 2011-09-21 2013-03-28 Magna Electronics, Inc. Vehicle vision system using image data transmission and power supply via a coaxial cable
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
DE102011118149A1 (en) 2011-11-10 2013-05-16 Gm Global Technology Operations, Llc Method for operating a safety system of a motor vehicle and safety system for a motor vehicle
DE102011118157A1 (en) 2011-11-10 2013-05-16 GM Global Technology Operations LLC (organized under the laws of the State of Delaware) Method for operating an information and entertainment system of a motor vehicle and information and entertainment system
JP5499011B2 (en) 2011-11-17 2014-05-21 Fuji Jukogyo Kabushiki Kaisha Outside environment recognition device and outside environment recognition method
US9264673B2 (en) 2011-11-20 2016-02-16 Magna Electronics, Inc. Vehicle vision system with enhanced functionality
US10099614B2 (en) 2011-11-28 2018-10-16 Magna Electronics Inc. Vision system for vehicle
US10457209B2 (en) 2012-02-22 2019-10-29 Magna Electronics Inc. Vehicle vision system with multi-paned view
US8694224B2 (en) 2012-03-01 2014-04-08 Magna Electronics Inc. Vehicle yaw rate correction
US10609335B2 (en) 2012-03-23 2020-03-31 Magna Electronics Inc. Vehicle vision system with accelerated object confirmation
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
DE102013217430A1 (en) 2012-09-04 2014-03-06 Magna Electronics, Inc. Driver assistance system for a motor vehicle
US8473144B1 (en) * 2012-10-30 2013-06-25 Google Inc. Controlling vehicle lateral lane positioning
US9090234B2 (en) 2012-11-19 2015-07-28 Magna Electronics Inc. Braking control system for vehicle
US20140218529A1 (en) 2013-02-04 2014-08-07 Magna Electronics Inc. Vehicle data recording system
US9092986B2 (en) 2013-02-04 2015-07-28 Magna Electronics Inc. Vehicular vision system
US9260095B2 (en) 2013-06-19 2016-02-16 Magna Electronics Inc. Vehicle vision system with collision mitigation
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070030131A1 (en) * 2005-08-02 2007-02-08 Nissan Motor Co., Ltd. Vehicle obstacle verification system
US20080162027A1 (en) * 2006-12-29 2008-07-03 Robotic Research, Llc Robotic driving system
US20130151058A1 (en) * 2011-12-09 2013-06-13 GM Global Technology Operations LLC Method and system for controlling a host vehicle
US20140018995A1 (en) * 2012-04-27 2014-01-16 Google Inc. Safely Navigating on Roads Through Maintaining Safe Distance from Other Vehicles

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10974649B2 (en) * 2017-10-02 2021-04-13 Magna Electronics Inc. Vehicular parking assist system using backup camera
US20210341927A1 (en) * 2018-11-09 2021-11-04 Waymo Llc Verifying Predicted Trajectories Using A Grid-Based Approach
CN113110069A (en) * 2021-05-24 2021-07-13 Wuhan University Iterative neural network robust control method based on magnetic suspension planar motor
WO2023172406A1 (en) * 2022-03-08 2023-09-14 Georgia Tech Research Corporation Systems and methods for vehicle signaling and bargaining

Also Published As

Publication number Publication date
US9988047B2 (en) 2018-06-05
US20150166062A1 (en) 2015-06-18
US10688993B2 (en) 2020-06-23
US20180273033A1 (en) 2018-09-27

Similar Documents

Publication Publication Date Title
US20200324764A1 (en) Vehicular control system with pedestrian avoidance
EP3216667B1 (en) Control system for vehicle
US11008016B2 (en) Display system, display method, and storage medium
US11884277B2 (en) Method for producing a model of the surroundings of a vehicle
CN112106348B (en) Passive infrared pedestrian detection and avoidance system
CN110356402B (en) Vehicle control device, vehicle control method, and storage medium
JP6460008B2 (en) Automatic driving device
US9507345B2 (en) Vehicle control system and method
JP7416176B2 (en) display device
EP2746137B1 (en) Method and system for assisting a driver
EP3539838A1 (en) Vehicle control device
US10099692B2 (en) Control system for vehicle
US20170293299A1 (en) Vehicle automated driving system
JP2019156174A (en) Vehicle control device, vehicle, vehicle control method, and program
US20180004205A1 (en) Control device of vehicle
US20090088966A1 (en) Driving support system
JP7099970B2 (en) Vehicle control device
US20180237008A1 (en) Control device for vehicle
EP3549838A1 (en) Vehicle control device
JP2011511729A (en) Driving support system control method and driving support system
JP2020163900A (en) Vehicle control device, vehicle control method, and program
JP2018086874A (en) Following-start control apparatus of vehicle
US11136026B2 (en) Vehicle control device
CN114194186A (en) Vehicle travel control device
US20220203985A1 (en) Vehicle control device, vehicle control method, and storage medium

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS