US11715373B2 - Traffic light visibility detection and augmented display - Google Patents
- Publication number
- US11715373B2 (application US17/456,723)
- Authority
- US
- United States
- Prior art keywords
- traffic signal
- response
- vehicle
- distance
- host vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09626—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K37/00—Dashboards
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/022—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by means of tv-camera scanning
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/052—Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09623—Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/096758—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where no selection takes place on the transmitted or the received information
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096783—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/176—Camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/177—Augmented reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/178—Warnings
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/179—Distances to obstacles or vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present disclosure relates generally to a system for providing traffic signal information to a driver within a motor vehicle. More specifically, aspects of the present disclosure relate to systems, methods and devices for determining an operator visibility of a traffic signal, controlling a vehicle in order to enhance the operator visibility and providing a traffic signal state information to a vehicle operator.
- V2X communications have enabled modern vehicles to communicate with data networks, proximate infrastructure, and other vehicles. These communications allow data to be exchanged, crowdsourced and analyzed to provide more information to these vehicles than was ever available before.
- signal phase and timing (SPaT) messaging enables traffic signal controllers to provide additional information to proximate vehicles, such as the current light state for each lane of an intersection and the time to state change for the lights. This information allows the vehicle to provide additional information and warnings to the driver about conditions that may not be readily apparent.
- a problem often arises in that a vehicle operator's view may be blocked by other proximate vehicles. This problem is exacerbated in the case where the vehicle is a smaller vehicle, such as a car, and the proximate vehicles are much larger, such as transport trucks or the like.
- a particular problem arises when a vehicle operator cannot view traffic signals when the view of the traffic signal is blocked by a larger vehicle. This problem may even exist for a vehicle equipped with advanced driver-assistance systems (ADAS) in that the vehicle may not be able to determine a state of the traffic signal using vehicle cameras and sensors.
- vehicle sensor methods and systems and related control logic for provisioning vehicle systems, methods for making and methods for operating such systems, and motor vehicles equipped with onboard control systems.
- an apparatus including a camera for capturing an image wherein the image includes a representation of a leading vehicle, a receiver configured to receive a traffic signal location and a traffic signal cycle state, a processor configured to estimate a leading vehicle dimension, a first distance between a host vehicle and the leading vehicle in response to the image, a second distance between the host vehicle and the traffic signal, estimating a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance, and generating a graphical user interface in response to the traffic signal view obstruction, and a display configured to display the graphical user interface wherein the graphical user interface is indicative of the traffic signal cycle state.
- a global positioning system for receiving a data indicative of a host vehicle location.
- a memory for storing a map data indicative of a traffic signal location.
- the display is a heads up display.
- the traffic signal view obstruction is estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
- a lidar configured to capture a depth map indicative of the leading vehicle dimension and the first distance.
- the receiver is a SPaT receiver and the traffic signal location and the traffic signal cycle state are indicated in a SPaT message.
- a vehicle controller indicative of a host vehicle speed and wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal obstruction.
- the graphical user interface is an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and wherein the display is an augmented reality heads up display.
- a method including capturing, by a host vehicle camera, an image of a leading vehicle, estimating a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image, receiving a data indicative of a traffic signal location and a traffic signal cycle state, determining a second distance between the host vehicle and the traffic signal, estimating a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance, and generating a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction.
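The obstruction estimate in the method above reduces to a line-of-sight test over the leading vehicle using the leading vehicle dimension, the first distance, and the second distance. A minimal geometric sketch (function name and the default eye and signal heights are illustrative assumptions, not part of the disclosure):

```python
def is_signal_obstructed(lead_height_m, first_dist_m, second_dist_m,
                         eye_height_m=1.2, signal_height_m=5.0):
    """Line-of-sight test: the signal is obstructed when the leading
    vehicle's roofline rises above the straight sight line from the
    operator's eye point to the signal head (similar triangles on a
    level road)."""
    # Height of the eye-to-signal sight line at the leading vehicle.
    sightline_h = eye_height_m + (signal_height_m - eye_height_m) * (
        first_dist_m / second_dist_m)
    return lead_height_m > sightline_h
```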
- determining a host vehicle location in response to a location data received via a global positioning system.
- displaying the graphical user interface to a vehicle operator via a heads up display.
- the data indicative of the traffic signal location is stored in a memory in the host vehicle.
- the traffic signal view obstruction is estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
- leading vehicle dimension and the first distance between a host vehicle and the leading vehicle are determined in response to a depth map generated by a host vehicle lidar system.
- the data indicative of the traffic signal location and the traffic signal cycle state is received via a SPaT data message received by a SPaT data receiver.
- determining a future traffic signal view obstruction in response to a host vehicle speed, the leading vehicle dimension, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal obstruction.
- the graphical user interface is an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and is presented to a vehicle operator on an augmented reality heads up display.
- a vehicle control system including a global positioning system configured for receiving a location data indicative of a host vehicle location, a host vehicle sensor for detecting a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle, a receiver configured to receive a traffic signal data indicative of a traffic signal location and a traffic signal cycle state, a processor configured to determine a traffic signal view obstruction between the host vehicle and a traffic signal in response to the host vehicle location, the first distance, the traffic signal location and the leading vehicle dimension, and a display configured to display a representation of the traffic signal in response to the determination of the traffic signal view obstruction and the traffic signal cycle state.
- a vehicle controller for detecting a host vehicle speed and wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the traffic signal location and for generating a driver alert in response to the future traffic signal obstruction.
- FIG. 1 shows an exemplary environment for use of the traffic light visibility detection and augmented display system according to an exemplary embodiment of the present disclosure
- FIG. 2 shows a block diagram illustrating a system for implementing the traffic light visibility detection and augmented display system in a motor vehicle according to an exemplary embodiment of the present disclosure
- FIG. 3 shows a view of an exemplary graphical user interface presented in an AR HUD according to an embodiment of the present disclosure
- FIG. 4 shows a flow chart illustrating an exemplary method for performing the traffic light visibility detection and augmented display according to an exemplary embodiment of the present disclosure
- FIG. 5 shows another block diagram illustrating a system for implementing the traffic light visibility detection and augmented display system in a motor vehicle according to an exemplary embodiment of the present disclosure
- FIG. 6 shows another flow chart illustrating an exemplary method for performing the traffic light visibility detection and augmented display according to an exemplary embodiment of the present disclosure.
- module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
- referring now to FIG. 1 , an exemplary environment 100 for use of the traffic light visibility detection and augmented display system according to an exemplary embodiment of the present disclosure is shown.
- the exemplary environment 100 depicts a road surface 107 leading to traffic light 105 having a stop line 109 , and a first vehicle 110 , and a host vehicle 120 .
- the first vehicle 110 and the host vehicle 120 are depicted as waiting for a change of state of the traffic light 105 .
- the vehicle operator's view 125 of the traffic signal 105 from the host vehicle 120 is obstructed by the first vehicle 110 .
- a camera system of an ADAS equipped vehicle may also not detect the light because the view 125 is blocked.
- An exemplary system is proposed to assist a vehicle operator and/or ADAS sensor in determining a traffic signal state.
- the system is first configured to determine a location of the traffic signal 105 .
- the location may be determined in response to map data, data transmitted from the traffic signal to the host vehicle via a vehicle to infrastructure (V2I) communications and/or in response to an image captured by a host vehicle camera.
- the system may next estimate a width and/or height of the first vehicle 110 in front of the host vehicle 120 using an image captured by a host vehicle camera or in response to a Lidar depth map or the like.
- the dimensions of the first vehicle 110 may also be determined in response to data transmitted from the first vehicle 110 to the host vehicle via a vehicle to vehicle (V2V) communications.
- the first vehicle 110 may be configured to transmit a set of dimensions of the first vehicle 110 in response to a request from the host vehicle 120 .
- the exemplary system may next be configured to determine if the first vehicle 110 height or width is larger than a threshold value wherein the threshold value is indicative of a high probability of an obstructed view for the host vehicle 120 .
- the exemplary system may then adjust the longitudinal distance between the first vehicle 110 and the host vehicle 120 , in response to the vehicle speed, vehicle height, and distance to the intersection with the traffic light, such that the traffic light falls within the field of view 125 of the host vehicle.
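The minimum longitudinal distance implied above follows from similar triangles: with the leading vehicle stopped a fixed distance short of the signal, the required gap is r * s / (1 - r), where r = (lead height - eye height) / (signal height - eye height) and s is the leading-vehicle-to-signal distance. A minimal sketch under those assumptions (function name, heights, and the level-road assumption are illustrative, not part of the disclosure):

```python
def min_following_gap(lead_height_m, signal_height_m, eye_height_m,
                      lead_to_signal_m):
    """Smallest gap behind the leading vehicle at which the signal head
    clears the leading vehicle's roofline (similar triangles; assumes a
    level road and a stopped leading vehicle)."""
    if lead_height_m <= eye_height_m:
        return 0.0  # a roofline below eye level never obstructs the view
    r = (lead_height_m - eye_height_m) / (signal_height_m - eye_height_m)
    return r * lead_to_signal_m / (1.0 - r)
```

For example, a 4.0 m trailer stopped 20 m short of a 5.0 m signal head would require a 56 m gap for a 1.2 m eye height, which may be impractical to maintain and motivates displaying the signal state instead.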
- the system may then receive a traffic signal light state from the traffic signal via a V2I communications and display the traffic light signal state as a graphic on a heads up display (HUD) or other driver interface.
- the exemplary system 200 may include an antenna 205 , a Signal Phase and Timing (SPaT) receiver 210 , a telemetrics module 215 , a processor 220 , a driver information center 230 , an augmented reality (AR) heads up device (HUD) 225 , a driver alert system 235 , an object detection system 245 , and a camera 255 .
- the SPaT receiver 210 may be configured to receive SPaT messages from a vehicle to infrastructure (V2I) transmitter via the antenna 205 .
- the SPaT message may define the current intersection signal light phases and current state of all lanes at the intersection.
- the data received via the SPaT message may then be coupled to the telemetrics module 215 for processing and coupling to the processor 220 .
- the telemetrics module 215 is configured to provide wireless connectivity between the host vehicle, other vehicles, infrastructure and data networks.
- the telemetrics module may include a plurality of antennas, modulators, demodulators, signal processors, and the like to process, transmit, and receive radio frequency signals carrying data for use by the vehicle, such as system updates, updated map data, infotainment system data and the like.
- the telemetrics module 215 may further include a GPS receiver for receiving GPS satellite signals used for determining a host vehicle location.
- the object detection system 245 may be a lidar system operative to determine a distance to a proximate object, such as a leading vehicle.
- a lidar system may transmit a light pulse in a known direction and determine a distance to an object at the known direction in response to a propagation time of the light pulse. The lidar system may then repeat this operation at multiple known directions to generate a depth map representative of objects within the field of view. The depth information within the depth map may then be used to determine a height, width, and distance of a leading vehicle in front of the host vehicle.
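A simplified sketch of deriving the leading vehicle's distance and dimensions from such a depth map (the single-object distance gate and the angle conventions are assumptions for illustration, not the disclosed algorithm):

```python
import math

def lead_vehicle_from_depth(depth_map, elev_angles, azim_angles, gate_m=1.5):
    """Estimate leading-vehicle distance, height, and width from a lidar
    depth map: take the nearest return, gate returns within gate_m of it
    (a crude single-object cluster), and convert the cluster's angular
    extents to metric height and width at that range.

    depth_map[i][j] is range (m) at elevation angle elev_angles[i] and
    azimuth angle azim_angles[j]; a very large value marks no return."""
    # Nearest return defines the candidate leading vehicle.
    d_min = min(min(row) for row in depth_map)
    rows, cols = [], []
    for i, row in enumerate(depth_map):
        for j, d in enumerate(row):
            if d - d_min <= gate_m:  # same-object gate
                rows.append(i)
                cols.append(j)
    # Metric size from angular extent at the object's range.
    height = d_min * (math.tan(elev_angles[max(rows)]) -
                      math.tan(elev_angles[min(rows)]))
    width = d_min * (math.tan(azim_angles[max(cols)]) -
                     math.tan(azim_angles[min(cols)]))
    return d_min, height, width
```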
- the object detection system 245 may be a radar system, infrared detection system or the like for performing a similar function.
- the camera 255 may be used to capture an image of a field of view from the host vehicle.
- the camera 255 may capture an image or a plurality of images at periodic time intervals for a field of view in front of the host vehicle.
- the images may depict a leading vehicle and be used to determine dimensions of the leading vehicle.
- the processor 220 is first configured to determine a location of the host vehicle and a location of an upcoming traffic signal.
- the location of the host vehicle may be determined in response to GPS data received from the telemetrics module 215 and/or map data or other vehicle control data.
- the processor 220 determines the location of the upcoming traffic signal in response to map data stored in a memory, in response to image data captured by the camera 255 and/or in response to data, such as a SPaT message received from the traffic signal or other proximate infrastructure.
- the SPaT message may be indicative of the traffic signal location, current states of the traffic signal for each lane in the intersection and time to a state change of each light in the traffic signal.
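An illustrative, simplified data model for the per-lane fields described above (all names are assumptions for illustration and do not follow the SAE J2735 wire format):

```python
from dataclasses import dataclass

@dataclass
class LanePhase:
    lane_id: int
    state: str            # e.g. "red", "yellow", "green"
    time_to_change_s: float

@dataclass
class SpatMessage:
    signal_location: tuple  # (latitude, longitude) of the traffic signal
    lanes: list             # list of LanePhase, one per intersection lane

def phase_for_lane(msg, lane_id):
    """Look up the current phase and countdown for the host vehicle's lane."""
    for lp in msg.lanes:
        if lp.lane_id == lane_id:
            return lp.state, lp.time_to_change_s
    return None
```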
- the processor 220 may next be configured to determine a distance between the host vehicle and the traffic signal in response to the traffic signal location and the host vehicle location.
- the processor 220 may next determine dimensions of a leading vehicle in response to the image data and/or the object detection system data. Dimensions of a leading vehicle may be determined in response to one or more of the images, using image processing techniques and the like. If the dimensions of the leading vehicle are greater than a threshold value, indicative of a possibility of an obstructed view of the traffic signal, the processor 220 next calculates a minimum distance to the leading vehicle where the traffic signal may still be visible to the host vehicle operator.
- the processor may be configured to generate a graphical user interface indicative of a current state of the traffic signal, such as a red light, and optionally a time to state change of the traffic signal. This graphical user interface may then be displayed on an AR HUD or other presentation display to the vehicle operator.
- the processor 220 may determine a vehicle speed of the host vehicle and/or a vehicle speed of the leading vehicle. The host vehicle speed may be determined in response to GPS data or in response to data from the vehicle controller 232 . The processor 220 may next determine if a distance between the host vehicle and the leading vehicle is approaching the minimum distance. If the minimum distance is being approached, the processor 220 may generate a driver alert to be provided to the driver alert system 235 , the driver information center 230 and/or the AR HUD 225 . The driver alert system 235 may be configured to provide an alert to the vehicle operator that the threshold distance to the first vehicle will soon be reached and therefore the view of the traffic signal may be obstructed.
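The approach check can be sketched as a time-to-threshold test on the closing speed (function name, parameters, and the 3-second warning horizon are illustrative assumptions):

```python
def should_alert(gap_m, host_speed_mps, lead_speed_mps, min_gap_m,
                 warn_horizon_s=3.0):
    """Warn when the gap to the leading vehicle will shrink below the
    visibility-preserving minimum within warn_horizon_s, given the
    current closing speed (host minus leading vehicle)."""
    closing = host_speed_mps - lead_speed_mps
    if closing <= 0:
        return False  # gap is steady or opening; no alert needed
    time_to_min = (gap_m - min_gap_m) / closing
    return time_to_min <= warn_horizon_s
```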
- the driver alert may be a flashing traffic signal graphic or the like.
- the processor 220 may generate a control signal instructing the vehicle controller 232 or ADAS controller to reduce the host vehicle speed such that at least the minimum distance is maintained between the host vehicle and the leading vehicle.
- the processor may be configured to generate a graphical user interface indicative of a current state of the traffic signal, such as a red light, and optionally a time to state change of the traffic signal.
- This graphical user interface may then be displayed on an AR HUD or other presentation display to the vehicle operator.
- FIG. 3 illustrates a view 300 of an exemplary graphical user interface presented in an AR HUD.
- the exemplary view illustrates an augmented reality graphic 305 of the traffic signal with the current signal state correctly displayed.
- the graphic 305 may be displayed in a location to the driver approximately where the traffic signal would be visible to the driver if not obstructed by the leading vehicle.
- the graphic may include a countdown timer 310 indicative of an upcoming time of a next state change of the traffic signal.
- the exemplary method is first operative to detect 410 an upcoming traffic signal.
- the upcoming traffic signal may be detected in response to map data, in response to a host vehicle image or sensor data, and/or in response to reception of a SPaT message or the like.
- the SPaT message may be received via vehicle to infrastructure (V2I) communications or other wireless communications network.
- the SPaT message may be transmitted periodically, such as every 100 ms.
- the SPaT message may be indicative of a location of the traffic signal, a current phase of the traffic signal for every traffic lane of an intersection and/or a time remaining of the phase for every lane.
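The per-lane content of such a message might be modeled as below. This is an illustrative data shape only; the class and field names are assumptions and do not reproduce the SAE J2735 SPaT wire format.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class LanePhase:
    lane_id: int
    phase: str              # e.g. "red", "yellow", "green"
    time_remaining_s: float


@dataclass
class SpatMessage:
    signal_lat: float
    signal_lon: float
    lanes: List[LanePhase]


def phase_for_lane(msg: SpatMessage, lane_id: int) -> LanePhase:
    """Return the phase entry for the host vehicle's current lane."""
    for lane in msg.lanes:
        if lane.lane_id == lane_id:
            return lane
    raise KeyError(f"no phase data for lane {lane_id}")
```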
- the method is next configured to calculate 415 a distance of the host vehicle to the traffic signal.
- the distance may be calculated in response to a host vehicle location determined in response to GPS data, map data, image data or the like, and the detected traffic signal location.
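With both positions expressed as latitude/longitude, the host-to-signal distance can be computed with the standard haversine formula; the helper below is a generic sketch, not code from the disclosure.

```python
import math


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))
```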
- the method is next configured to determine 420 if a leading vehicle is present. The presence of the leading vehicle may be determined in response to image data captured by a host vehicle camera, in response to V2V communications between the host vehicle and the leading vehicle, and/or in response to host vehicle sensor data, such as a lidar depth map or the like. If no leading vehicle is present, the method may return to detecting 410 a traffic signal location.
- the method is next configured to determine 425 dimensions of the leading vehicle.
- the dimensions of the leading vehicle may be determined in response to performing image processing techniques on an image captured by a host vehicle camera, in response to host vehicle sensor data, such as lidar, or in response to V2V communications with the leading vehicle.
- the leading vehicle may transmit data indicative of the leading vehicle location, velocity and dimensions. This data may be transmitted in response to a request by the host vehicle and/or periodically by the leading vehicle.
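A request/reply exchange of this kind might look like the sketch below. The message fields and the `channel` abstraction are assumptions for illustration; they do not correspond to a standardized V2V message set.

```python
from dataclasses import dataclass


@dataclass
class V2VDimensionReply:
    """Illustrative payload a leading vehicle might send over V2V."""
    vehicle_id: str
    height_m: float
    width_m: float
    speed_mps: float


def request_dimensions(channel, vehicle_id: str) -> V2VDimensionReply:
    """Ask the leading vehicle for its dimensions over the given channel.

    `channel` is any object providing send(dict) and recv() -> reply.
    """
    channel.send({"type": "dimension_request", "target": vehicle_id})
    return channel.recv()
```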
- the host vehicle may determine the height and width of the leading vehicle.
- the host vehicle may further determine a lateral position of the leading vehicle within the vehicle lane.
- the leading vehicle dimensions are next compared 430 to a threshold.
- the threshold may be indicative that the leading vehicle may potentially block the view of the traffic signal to the host vehicle sensor or a host vehicle operator. If the dimensions of the leading vehicle do not exceed the threshold, the method may return to detecting 410 a traffic signal location. If the leading vehicle dimensions exceed the threshold, the method is next configured to determine a distance 435 to the leading vehicle. The distance to the leading vehicle may be determined in response to host vehicle sensor data, such as image or lidar depth map data, and/or location data of the host vehicle and the leading vehicle.
- the distance is then compared 440 to a distance threshold.
- the distance threshold may be indicative of a distance behind the leading vehicle at which the host vehicle operator and/or sensors may lose visibility of the traffic signal.
- This distance threshold may be calculated in response to a distance from the host vehicle to the traffic signal, dimensions of the leading vehicle, and height and location of the traffic signal. As the host vehicle approaches the leading vehicle, the host vehicle operator's view of the traffic signal may become obstructed.
- the distance threshold may be dependent on the host vehicle speed and the leading vehicle speed and location.
- the method may be configured to display 445 an indication of the traffic signal state to the vehicle operator.
- This indication may be an augmented reality traffic signal displayed on an HUD or the like.
- the indication may be displayed on a vehicle user interface, such as a center stack display, an instrument cluster display, or the like.
- the method next determines 450 if the distance is approaching the threshold distance.
- the distance may be approaching the threshold distance if the host vehicle is travelling faster than the leading vehicle. If the distance is not approaching the threshold distance, the method may return to detecting 410 a traffic signal location. If the distance is approaching the threshold distance, the method may next issue 455 a driver alert indicative to the vehicle operator that the traffic signal may soon not be visible to the vehicle operator, or may not be detectable by the host vehicle sensors.
- This driver alert may be a flashing light, a chime, buzzer or other audible alarm, or other graphic presented on a graphical user interface.
- the vehicle operator or the ADAS may reduce the vehicle speed to ensure that at least the threshold distance is maintained between the leading vehicle and the host vehicle such that the traffic signal remains visible.
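The "approaching the threshold" determination above can be phrased as a time-to-crossing computation on the closing speed. A minimal sketch, with an assumed 4-second warning horizon:

```python
def time_to_threshold_s(gap_m: float, threshold_m: float,
                        host_speed_mps: float, lead_speed_mps: float):
    """Seconds until the following gap shrinks to the visibility threshold.

    Returns 0.0 if the gap is already at or inside the threshold, and None
    if the host is not closing on the leading vehicle.
    """
    if gap_m <= threshold_m:
        return 0.0
    closing_mps = host_speed_mps - lead_speed_mps
    if closing_mps <= 0.0:
        return None
    return (gap_m - threshold_m) / closing_mps


def should_alert(gap_m: float, threshold_m: float, host_speed_mps: float,
                 lead_speed_mps: float, warn_horizon_s: float = 4.0) -> bool:
    """Issue the driver alert when the threshold will be crossed soon."""
    t = time_to_threshold_s(gap_m, threshold_m, host_speed_mps, lead_speed_mps)
    return t is not None and t <= warn_horizon_s
```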
- the exemplary system 500 may include a camera 510 , a receiver 520 , a processor 530 , a display 540 , a GPS 550 , a memory 560 , a lidar 570 and a vehicle controller 580 .
- the camera 510 may be integral to the host vehicle and may be configured for capturing an image wherein the image includes a representation of a leading vehicle.
- the camera 510 may be a forward-facing camera for capturing a forward field of view from the host vehicle.
- the camera 510 may capture a series of images at periodic time intervals.
- the images may then be coupled to an image processor, or the processor 530 , for analysis using image processing techniques.
- the images may be used to determine the leading vehicle location, distance between the leading vehicle and the host vehicle, distance and location of the traffic signal, dimensions of the leading vehicle and dimensions associated with the traffic signal, such as height off of the ground and the like.
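One common way to back such estimates out of a single calibrated image is the pinhole model, under which apparent size scales inversely with range. The helpers below are a sketch assuming a known focal length in pixels; they are not the disclosed image-processing pipeline.

```python
def distance_from_width(focal_px: float, known_width_m: float,
                        apparent_width_px: float) -> float:
    """Pinhole-model range estimate: d = f * W / w, given a known or
    assumed real-world width W and its apparent width w in pixels."""
    return focal_px * known_width_m / apparent_width_px


def height_from_pixels(focal_px: float, distance_m: float,
                       apparent_height_px: float) -> float:
    """Once range is known, recover metric height: H = h * d / f."""
    return apparent_height_px * distance_m / focal_px
```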
- the receiver 520 may be a radio frequency receiver or other wireless information receiver configured to receive data indicative of a traffic signal location and a traffic signal cycle state.
- the data may be transmitted from a traffic signal controller via a V2I communications channel or network.
- the receiver 520 may be a SPaT receiver and the traffic signal location and the traffic signal cycle state are indicated in a SPaT data message.
- the processor 530 may be a vehicle control processor, advanced driving assist processor or the like, configured to generate an augmented reality representation of the traffic signal including the traffic signal cycle state and may further include a countdown timer indicative of a time remaining in the traffic signal state cycle.
- the processor 530 may be configured to determine a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image captured by the camera 510 .
- the processor 530 may next determine a second distance between the host vehicle and the traffic signal in response to a map data or the like.
- the processor 530 may then estimate a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance.
- the traffic signal view obstruction may be indicative of the leading vehicle obstructing a line of sight between a host vehicle operator or host vehicle sensor and the traffic signal.
- the traffic signal view obstruction may be estimated in response to a height of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
- the processor 530 may then generate a graphical user interface in response to the traffic signal view obstruction.
- the display 540 may be configured to receive the graphical user interface from the processor 530 and to display the graphical user interface to a vehicle operator on a display within the host vehicle.
- the graphical user interface displayed to the vehicle operator may be indicative of the traffic signal cycle state.
- the graphical user interface may also display a countdown timer indicative of a time remaining until an upcoming state change of the traffic signal.
- the display 540 may be a heads up display configured for presenting an augmented reality representation of the traffic signal and the traffic signal cycle state to a vehicle operator.
- the exemplary system 500 may further include GPS 550 for receiving data from one or more satellites and estimating a host vehicle location in response to the received data. The GPS 550 may then couple GPS data indicative of the host vehicle location to the processor 530 .
- a memory 560 may be communicatively coupled to the processor 530 for storing a map data. The map data may be indicative of a traffic signal location.
- the system may further include a lidar 570 configured to capture a depth map indicative of the leading vehicle dimension and the first distance. The depth map may be used to determine dimensions of a leading vehicle and/or distances from the host vehicle to the leading vehicle and/or the traffic signal or the like.
- the system 500 may further include a vehicle controller 580 for controlling one or more of the steering, throttle and/or braking of the host vehicle.
- the vehicle controller 580 may be configured to perform ADAS algorithms.
- the vehicle controller 580 may further provide data indicative of a host vehicle speed, and the processor may be further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the second distance, and to generate a driver alert in response to the future traffic signal view obstruction.
- the system 500 may be an advanced driver-assistance system including a GPS 550 configured for receiving a location data indicative of a host vehicle location.
- a host vehicle sensor may include a camera and a lidar for capturing an image. The image may be used for detecting a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle.
- the system may include a receiver 520 configured to receive a traffic signal data indicative of a traffic signal location and a traffic signal cycle state.
- a processor 530 may be configured to determine a traffic signal view obstruction between the host vehicle and a traffic signal in response to the host vehicle location, the first distance, the traffic signal location and the leading vehicle dimension.
- the system 500 may further include a display 540 configured to visually present a representation of the traffic signal to a vehicle operator in response to the determination of the traffic signal view obstruction and the traffic signal cycle state.
- the exemplary system may further include a vehicle controller 580 for detecting a host vehicle speed.
- the processor 530 may be further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the traffic signal location, and to generate a driver alert in response to the future traffic signal view obstruction.
- Turning now to FIG. 6 , a flow chart illustrating another exemplary method 600 for implementing a traffic light countdown notification and alert suppression system in a motor vehicle according to an exemplary embodiment of the present disclosure is shown.
- the method is first operative for capturing 610 , by a host vehicle camera, an image of a leading vehicle.
- the image may be a depth map captured by a lidar system.
- the method is next operative for determining 620 a host vehicle location in response to a location data received via a GPS or global navigation satellite system (GNSS).
- the host vehicle location may be indicated by the received data and may be confirmed in response to map data stored within a memory and correlation with landmarks or other indicators proximate to the host vehicle and detected with other host vehicle sensors.
- the method next estimates 630 a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image.
- Image processing techniques in addition to multiple images captured at different time intervals may be used to determine a leading vehicle dimension, such as height or width.
- the distance from the host vehicle to the leading vehicle may be estimated using multiple images which may vary in time and/or space.
- the leading vehicle dimension and the first distance between a host vehicle and the leading vehicle may be determined in response to a depth map generated by a host vehicle lidar system.
- the system may then receive 640 a data indicative of a traffic signal location and a traffic signal cycle state. The data indicative of the traffic signal location may be stored in a memory in the host vehicle.
- the data may be received from the traffic signal controller or the like via a V2I communications channel.
- the V2I communications channel may be a wireless radio frequency channel.
- the data indicative of the traffic signal location and the traffic signal cycle state may be received via a SPaT data message received by a SPaT data receiver.
- the method next determines 650 a second distance between the host vehicle and the traffic signal.
- the location of the host vehicle may be determined in response to the GPS or GNSS data.
- the location of the traffic signal may be determined in response to map data or may be determined in response to sensor data such as a camera image or a lidar depth map.
- the method is next configured for determining 660 a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance.
- the traffic signal view obstruction may be estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
- the obstruction may be determined using geometric techniques to determine an intersection between the height and/or width of the leading vehicle and a line of sight from the vehicle operator to the traffic signal.
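Such a geometric test can be sketched by interpolating the sight line's height and lateral offset out to the leading vehicle's range and checking both against the vehicle's silhouette. The 1.2 m default eye height and the parameter names are assumptions for illustration:

```python
def is_obstructed(gap_m: float, lead_height_m: float, lead_half_width_m: float,
                  lead_lateral_m: float, dist_to_signal_m: float,
                  signal_height_m: float, signal_lateral_m: float,
                  eye_height_m: float = 1.2) -> bool:
    """True when the sight line from the driver's eye to the signal head
    passes through the leading vehicle's silhouette (both the vertical and
    the lateral extent must intersect). Lateral values are offsets from the
    eye's forward axis."""
    # Height of the sight line where it crosses the leading vehicle's range
    sight_h = eye_height_m + gap_m * (signal_height_m - eye_height_m) / dist_to_signal_m
    vertical_hit = sight_h <= lead_height_m
    # Sideways position of the sight line at the same range
    sight_y = gap_m * signal_lateral_m / dist_to_signal_m
    lateral_hit = abs(sight_y - lead_lateral_m) <= lead_half_width_m
    return vertical_hit and lateral_hit
```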
- the method may further determine a future traffic signal view obstruction in response to a host vehicle speed, the leading vehicle dimension, the first distance, and the second distance, and may generate a driver alert in response to the future traffic signal view obstruction.
- the method next generates 670 a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction.
- the graphical user interface may include a graphical representation of the traffic signal including the current traffic signal cycle state, such as red or green, as well as a countdown timer indicative of a remaining time in the current traffic signal cycle state.
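A minimal view-model for such a graphic might carry the lamp color, a whole-second countdown, and an urgency cue. The field names and the 3-second flash cue are illustrative assumptions, not part of the disclosure:

```python
def build_signal_gui_state(phase: str, time_remaining_s: float) -> dict:
    """Assemble display state for the traffic signal graphic."""
    return {
        "lamp": phase,                                 # "red", "yellow", "green"
        "countdown_s": max(0, int(time_remaining_s)),  # never shown negative
        "flash": phase == "red" and time_remaining_s < 3.0,
    }
```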
- the method is then configured for displaying 680 the graphical user interface to a vehicle operator via a heads up display.
- the graphical user interface may be an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and is presented to a vehicle operator on an augmented reality heads up display.
- the graphical user interface may be a display screen within the vehicle cabin viewable by the vehicle operator, such as a center stack display, instrument cluster display, dashboard display or the like.
Abstract
A method includes capturing, by a host vehicle camera, an image of a leading vehicle, estimating a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image, receiving a data indicative of a traffic signal location and a traffic signal cycle state, determining a second distance between the host vehicle and the traffic signal, estimating a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance, and generating a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction.
Description
This application claims the benefit of Chinese Application No. 202111268575.8, filed Oct. 29, 2021, which is incorporated herein by reference in its entirety.
The present disclosure relates generally to a system for providing traffic signal information to a driver within a motor vehicle. More specifically, aspects of the present disclosure relate to systems, methods and devices for determining an operator visibility of a traffic signal, controlling a vehicle in order to enhance the operator visibility and providing a traffic signal state information to a vehicle operator.
Communications systems, such as vehicle to everything (V2X) communications, have enabled modern vehicles to communicate with data networks, proximate infrastructure, and other vehicles. These communications allow data to be exchanged, crowdsourced and analyzed to provide more information to these vehicles than was previously available. For example, signal phase and timing (SPaT) messaging enables traffic signal controllers to provide additional information to proximate vehicles, such as current light states for each lane of an intersection and time to state change for the lights. This information allows the vehicle to provide additional information and warnings to the driver about conditions that may not be readily apparent.
When operating a vehicle, a problem often arises in that a vehicle operator's view may be blocked by other proximate vehicles. This problem is exacerbated when the vehicle is a smaller vehicle, such as a car, and the proximate vehicles are much larger, such as transport trucks or the like. A particular problem arises when a vehicle operator cannot see a traffic signal because the view of the traffic signal is blocked by a larger vehicle. This problem may exist even for a vehicle equipped with advanced driver-assistance systems (ADAS), in that the vehicle may not be able to determine a state of the traffic signal using vehicle cameras and sensors. In addition, if a vehicle operator is unsure of a traffic signal state and the ADAS equipped vehicle begins moving, the operator may unnecessarily disengage the ADAS system. It would be desirable to provide a traffic light notification system to a vehicle operator while overcoming the aforementioned problems.
The above information disclosed in this background section is only for enhancement of understanding of the background of the invention, and therefore it may contain information that does not form part of the prior art that is already known in this country to a person of ordinary skill in the art.
Disclosed herein are vehicle sensor methods and systems and related control logic for provisioning vehicle systems, methods for making and methods for operating such systems, and motor vehicles equipped with onboard control systems. By way of example, and not limitation, presented herein are various embodiments of systems for the accurate determination of a traffic signal view obstruction and appropriate driver notification of traffic light state changes, and methods for performing a driver view analysis and traffic light notification in a motor vehicle.
In accordance with an aspect of the present disclosure, an apparatus including a camera for capturing an image wherein the image includes a representation of a leading vehicle, a receiver configured to receive a traffic signal location and a traffic signal cycle state, a processor configured to estimate a leading vehicle dimension, a first distance between a host vehicle and the leading vehicle in response to the image, a second distance between the host vehicle and the traffic signal, estimating a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance, and generating a graphical user interface in response to the traffic signal view obstruction, and a display configured to display the graphical user interface wherein the graphical user interface is indicative of the traffic signal cycle state.
In accordance with an aspect of the present disclosure, a global positioning system for receiving a data indicative of a host vehicle location.
In accordance with an aspect of the present disclosure, a memory for storing a map data indicative of a traffic signal location.
In accordance with an aspect of the present disclosure, wherein the display is a heads up display.
In accordance with an aspect of the present disclosure, wherein the traffic signal view obstruction is estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
In accordance with an aspect of the present disclosure, a lidar configured to capture a depth map indicative of the leading vehicle dimension and the first distance.
In accordance with an aspect of the present disclosure, wherein the receiver is a SPaT receiver and the traffic signal location and the traffic signal cycle state are indicated in a SPaT message.
In accordance with an aspect of the present disclosure, a vehicle controller indicative of a host vehicle speed and wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal obstruction.
In accordance with an aspect of the present disclosure, wherein the graphical user interface is an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and wherein the display is an augmented reality heads up display.
In accordance with an aspect of the present disclosure, a method including capturing, by a host vehicle camera, an image of a leading vehicle, estimating a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image, receiving a data indicative of a traffic signal location and a traffic signal cycle state, determining a second distance between the host vehicle and the traffic signal, estimating a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance, and generating a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction.
In accordance with an aspect of the present disclosure, determining a host vehicle location in response to a location data received via a global positioning system.
In accordance with an aspect of the present disclosure, displaying the graphical user interface to a vehicle operator via a heads up display.
In accordance with an aspect of the present disclosure, wherein the data indicative of the traffic signal location is stored in a memory in the host vehicle.
In accordance with an aspect of the present disclosure, wherein the traffic signal view obstruction is estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
In accordance with an aspect of the present disclosure, wherein the leading vehicle dimension and the first distance between a host vehicle and the leading vehicle are determined in response to a depth map generated by a host vehicle lidar system.
In accordance with an aspect of the present disclosure, wherein the data indicative of the traffic signal location and the traffic signal cycle state is received via a SPaT data message received by a SPaT data receiver.
In accordance with an aspect of the present disclosure, determining a future traffic signal view obstruction in response to a host vehicle speed, the leading vehicle dimension, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal obstruction.
In accordance with an aspect of the present disclosure, wherein the graphical user interface is an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and is presented to a vehicle operator on an augmented reality heads up display.
In accordance with an aspect of the present disclosure, a vehicle control system including a global positioning system configured for receiving a location data indicative of a host vehicle location, a host vehicle sensor for detecting a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle, a receiver configured to receive a traffic signal data indicative of a traffic signal location and a traffic signal cycle state, a processor configured to determine a traffic signal view obstruction between the host vehicle and a traffic signal in response to the host vehicle location, the first distance, the traffic signal location and the leading vehicle dimension, and a display configured to display a representation of the traffic signal in response to the determination of the traffic signal view obstruction and the traffic signal cycle state.
In accordance with an aspect of the present disclosure, a vehicle controller for detecting a host vehicle speed and wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the traffic signal location and for generating a driver alert in response to the future traffic signal obstruction.
The above advantage and other advantages and features of the present disclosure will be apparent from the following detailed description of the preferred embodiments when taken in connection with the accompanying drawings.
The exemplary embodiments will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and wherein:
The exemplifications set out herein illustrate preferred embodiments of the invention, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.
The following detailed description is merely exemplary in nature and is not intended to limit the application and uses. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description. As used herein, the term module refers to an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Turning now to FIG. 1 , an exemplary environment 100 for use of the traffic light visibility detection and augmented display system according to an exemplary embodiment of the present disclosure is shown. The exemplary environment 100 depicts a road surface 107 leading to a traffic light 105 having a stop line 109, a first vehicle 110, and a host vehicle 120.
In this exemplary embodiment, the first vehicle 110 and the host vehicle 120 are depicted as waiting for a change of state of the traffic light 105. The vehicle operator's view 125 of the traffic signal 105 from the host vehicle 120 is obstructed by the first vehicle 110. A camera system of an ADAS equipped vehicle may also not detect the light because the view 125 is blocked.
An exemplary system is proposed to assist a vehicle operator and/or ADAS sensor in determining a traffic signal state. The system is first configured to determine a location of the traffic signal 105. The location may be determined in response to map data, data transmitted from the traffic signal to the host vehicle via a vehicle to infrastructure (V2I) communications and/or in response to an image captured by a host vehicle camera.
The system may next estimate a width and/or height of the first vehicle 110 in front of the host vehicle 120 using an image captured by a host vehicle camera or in response to a Lidar depth map or the like. The dimensions of the first vehicle 110 may also be determined in response to data transmitted from the first vehicle 110 to the host vehicle via a vehicle to vehicle (V2V) communications. For example, the first vehicle 110 may be configured to transmit a set of dimensions of the first vehicle 110 in response to a request from the host vehicle 120.
The exemplary system may next be configured to determine if the first vehicle 110 height or width is larger than a threshold value, wherein the threshold value is indicative of a high probability of an obstructed view for the host vehicle 120. The exemplary system may then adjust the longitudinal distance between the first vehicle 110 and the host vehicle 120 based on vehicle speed, vehicle height, and distance to the intersection with the traffic light, such that the traffic light will fall into the field of view 125 of the host vehicle. Alternatively, if the distance between the first vehicle 110 and the host vehicle 120 is less than the required longitudinal distance to the first vehicle 110, such that the host vehicle view 125 of the traffic signal is likely obstructed, the system may then receive a traffic signal light state from the traffic signal via a V2I communications and display the traffic light signal state as a graphic on a heads up display (HUD) or other driver interface.
Turning now to FIG. 2 , a block diagram illustrating a system 200 for implementing the traffic light visibility detection and augmented display system in a motor vehicle according to an exemplary embodiment of the present disclosure is shown. The exemplary system 200 may include an antenna 205, a Signal Phase and Timing (SPaT) receiver 210, a telematics module 215, a processor 220, a driver information center 230, an augmented reality (AR) heads up display (HUD) 225, a driver alert system 235, an object detection system 245, and a camera 255.
The SPaT receiver 210 may be configured to receive SPaT messages from a vehicle to infrastructure (V2I) transmitter via the antenna 205. The SPaT message may define the current intersection signal light phases and current state of all lanes at the intersection. The data received via the SPaT message may then be coupled to the telematics module 215 for processing and coupling to the processor 220. The telematics module 215 is configured to provide wireless connectivity between the host vehicle, other vehicles, infrastructure and data networks. The telematics module may include a plurality of antennas, modulators, demodulators, signal processors, and the like to process, transmit, and receive radio frequency signals carrying data for use by the vehicle, such as system updates, updated map data, infotainment system data and the like. The telematics module 215 may further include a GPS receiver for receiving GPS satellite signals used for determining a host vehicle location.
The object detection system 245 may be a lidar system operative to determine a distance to a proximate object, such as a leading vehicle. A lidar system may transmit a light pulse in a known direction and determine a distance to an object at the known direction in response to a propagation time of the light pulse. The lidar system may then repeat this operation at multiple known directions to generate a depth map representative of objects within the field of view. The depth information within the depth map may then be used to determine a height, width, and distance of a leading vehicle in front of the host vehicle. Alternatively, the object detection system 245 may be a radar system, infrared detection system or the like for performing a similar function.
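Extracting a leading vehicle's height, width, and range from such a depth map can be sketched as follows. This assumes the leading vehicle has already been segmented into a pixel mask and that the lidar points are expressed as (lateral, up, forward) metres; both are illustrative assumptions, not details from the disclosure:

```python
import numpy as np

def leading_vehicle_dims(points: np.ndarray, mask: np.ndarray):
    """Estimate (height, width, distance) of a segmented object.

    points: H x W x 3 array of (x_lateral, y_up, z_forward) metres
            from the lidar depth map.
    mask:   H x W boolean array marking pixels on the leading vehicle.
    """
    obj = points[mask]                      # N x 3 points on the vehicle
    height = float(obj[:, 1].max() - obj[:, 1].min())
    width = float(obj[:, 0].max() - obj[:, 0].min())
    distance = float(np.median(obj[:, 2]))  # median range is robust to noise
    return height, width, distance
```

The resulting dimensions feed the threshold comparison described above, and the distance feeds the minimum-gap check.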
The camera 255 may be used to capture an image of a field of view from the host vehicle. For example, the camera 255 may capture an image or a plurality of images at periodic time intervals for a field of view in front of the host vehicle. The images may depict a leading vehicle and be used to determine dimensions of the leading vehicle.
The processor 220 is first configured to determine a location of the host vehicle and a location of an upcoming traffic signal. The location of the host vehicle may be determined in response to GPS data received from the telemetrics module 215 and/or map data or other vehicle control data. The processor 220 then determines the location of the upcoming traffic signal in response to map data stored in a memory, in response to image data captured by the camera 255 and/or in response to data, such as a SPaT message received from the traffic signal or other proximate infrastructure. The SPaT message may be indicative of the traffic signal location, current states of the traffic signal for each lane in the intersection and time to a state change of each light in the traffic signal.
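The SPaT content described in this paragraph might be modeled as a small per-lane data structure. The class and field names below are illustrative only and do not follow the SAE J2735 encoding used by real SPaT messages:

```python
from dataclasses import dataclass

@dataclass
class LanePhase:
    lane_id: int
    state: str             # e.g. "red", "yellow", "green"
    seconds_to_change: float

@dataclass
class SpatMessage:
    signal_lat: float      # traffic signal location
    signal_lon: float
    phases: list           # one LanePhase per lane in the intersection

    def phase_for(self, lane_id: int) -> LanePhase:
        """Return the current phase for the host vehicle's lane."""
        for p in self.phases:
            if p.lane_id == lane_id:
                return p
        raise KeyError(lane_id)
```

With such a structure, the processor could look up the phase for the host lane and read both the current state and the remaining time for the countdown graphic.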
The processor 220 may next be configured to determine a distance between the host vehicle and the traffic signal in response to the traffic signal location and the host vehicle location. The processor 220 may next determine dimensions of a leading vehicle in response to the image data and/or the object detection system data. Dimensions of a leading vehicle may be determined in response to one or more of the images, using image processing techniques and the like. If the dimensions of the leading vehicle are greater than a threshold value, indicative of a possibility of an obstructed view of the traffic signal, the processor 220 next calculates a minimum distance to the leading vehicle where the traffic signal may still be visible to the host vehicle operator. If the minimum distance is greater than the actual distance between the leading vehicle and the host vehicle, the processor may be configured to generate a graphical user interface indicative of a current state of the traffic signal, such as a red light, and optionally a time to state change of the traffic signal. This graphical user interface may then be displayed on an AR HUD or other presentation display to the vehicle operator.
If the minimum distance to the leading vehicle has not been reached, the processor 220 may determine a vehicle speed of the host vehicle and/or a vehicle speed of the leading vehicle. The host vehicle speed may be determined in response to GPS data or in response to data from the vehicle controller 232. The processor 220 may next determine if a distance between the host vehicle and the leading vehicle is approaching the minimum distance. If the minimum distance is being approached, the processor 220 may generate a driver alert to be provided to the driver alert system 235, the driver information center 230 and/or the AR HUD 225. The driver alert system 235 may be configured to alert the vehicle operator that the threshold distance to the first vehicle will soon be reached and that the view of the traffic signal may therefore become obstructed. For example, the driver alert may be a flashing traffic signal graphic or the like. If the vehicle is an ADAS equipped vehicle and vehicle operation is currently being controlled by the ADAS system, the processor 220 may generate a control signal instructing the vehicle controller 232 or ADAS controller to reduce the host vehicle speed such that at least the minimum distance is maintained between the host vehicle and the leading vehicle.
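The display-versus-alert decision in the two paragraphs above can be sketched as a single function. The alert horizon and return values are assumptions for illustration; the disclosure does not fix these specifics:

```python
def plan_response(gap: float, min_gap: float, host_speed: float,
                  lead_speed: float, alert_horizon_s: float = 5.0) -> str:
    """Decide between no action, a driver alert, and the HUD graphic.

    gap:      current distance to the leading vehicle (m)
    min_gap:  minimum gap keeping the signal visible (m)
    Returns "display" once visibility is already lost, "alert" when the
    closing rate will erase the remaining margin within alert_horizon_s
    seconds, else "none".
    """
    if gap <= min_gap:
        return "display"           # show the AR traffic-signal graphic
    closing = host_speed - lead_speed
    if closing > 0 and (gap - min_gap) / closing < alert_horizon_s:
        return "alert"             # warn the driver / slow the ADAS vehicle
    return "none"
```

An ADAS controller acting on "alert" would reduce host speed until `closing` is no longer positive, holding the gap at or above `min_gap`.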
If the minimum distance is greater than the actual distance between the leading vehicle and the host vehicle, the processor may be configured to generate a graphical user interface indicative of a current state of the traffic signal, such as a red light, and optionally a time to state change of the traffic signal. This graphical user interface may then be displayed on an AR HUD or other presentation display to the vehicle operator. FIG. 3 illustrates a view 300 of an exemplary graphical user interface presented in an AR HUD. The exemplary view illustrates an augmented reality graphic 305 of the traffic signal with the current signal state correctly displayed. In some embodiments, the graphic 305 may be displayed at a location approximately where the traffic signal would be visible to the driver if not obstructed by the leading vehicle. In addition, the graphic may include a countdown timer 310 indicative of an upcoming time of a next state change of the traffic signal.
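Producing the countdown text for a graphic like 310 is straightforward once the next state-change time is known from the SPaT data. The label format below is purely hypothetical:

```python
import time

def countdown_label(state: str, change_epoch_s: float, now_s=None) -> str:
    """Label for the AR traffic-signal graphic: current state plus a
    countdown to the next state change (hypothetical formatting)."""
    now_s = time.time() if now_s is None else now_s
    remaining = max(0, int(round(change_epoch_s - now_s)))
    return f"{state.upper()} {remaining:d}s"

# e.g. with 12 s left on a red phase the HUD would read "RED 12s"
```

The clamp at zero keeps the graphic sensible if the state change arrives slightly before the next SPaT update.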
Turning now to FIG. 4 , a flow chart illustrating an exemplary method 400 for implementing the traffic light visibility detection and augmented display system in a motor vehicle according to an exemplary embodiment of the present disclosure is shown. The exemplary method is first operative to detect 410 an upcoming traffic signal. The upcoming traffic signal may be detected in response to map data, in response to a host vehicle image or sensor data, and/or in response to reception of a SPaT message or the like. The SPaT message may be received via vehicle to infrastructure (V2I) communications or other wireless communications network. The SPaT message may be transmitted periodically, such as every 100 ms. The SPaT message may be indicative of a location of the traffic signal, a current phase of the traffic signal for every traffic lane of an intersection and/or a time remaining of the phase for every lane.
The method is next configured to calculate 415 a distance of the host vehicle to the traffic signal. The distance may be calculated in response to a host vehicle location determined in response to GPS data, map data, image data or the like, and the detected traffic signal location. The method is next configured to determine 420 if a leading vehicle is present. The presence of the leading vehicle may be determined in response to image data captured by a host vehicle camera, in response to V2V communications between the host vehicle and the leading vehicle, and/or in response to host vehicle sensor data, such as a lidar depth map or the like. If no leading vehicle is present, the method may return to detecting 410 a traffic signal location.
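For GPS coordinates of the host vehicle and the traffic signal, the distance calculation of step 415 could use the haversine great-circle formula. This is one common choice, not necessarily the method used in the disclosure:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float,
                radius_m: float = 6_371_000.0) -> float:
    """Great-circle distance in metres between two GPS fixes (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * radius_m * math.asin(math.sqrt(a))
```

At intersection scales the error versus a planar approximation is negligible, so either approach would serve for the visibility geometry.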
If a leading vehicle is present, the method is next configured to determine 425 dimensions of the leading vehicle. The dimensions of the leading vehicle may be determined in response to performing image processing techniques on an image captured by a host vehicle camera, in response to host vehicle sensor data, such as lidar, or in response to V2V communications with the leading vehicle. For example, using V2V communications, the leading vehicle may transmit data indicative of the leading vehicle location, velocity and dimensions. This data may be transmitted in response to a request by the host vehicle and/or periodically by the leading vehicle. In particular, the host vehicle may determine the height and width of the leading vehicle. The host vehicle may further determine a lateral position of the leading vehicle within the vehicle lane.
The leading vehicle dimensions are next compared 430 to a threshold. The threshold may be indicative that the leading vehicle may potentially block the view of the traffic signal to the host vehicle sensor or a host vehicle operator. If the dimensions of the leading vehicle do not exceed the threshold, the method may return to detecting 410 a traffic signal location. If the leading vehicle dimensions exceed the threshold, the method is next configured to determine a distance 435 to the leading vehicle. The distance to the leading vehicle may be determined in response to host vehicle sensor data, such as image or lidar depth map data, and/or location data of the host vehicle and the leading vehicle.
The distance is then compared 440 to a distance threshold. The distance threshold may be indicative of a distance behind the leading vehicle at which the host vehicle operator and/or sensors may lose visibility of the traffic signal. This distance threshold may be calculated in response to a distance from the host vehicle to the traffic signal, dimensions of the leading vehicle, and height and location of the traffic signal. As the host vehicle approaches the leading vehicle, the host vehicle operator's view of the traffic signal may become obstructed. The distance threshold may be dependent on the host vehicle speed and the leading vehicle speed and location.
If the distance to the leading vehicle is less than the distance threshold, indicating that the host vehicle view of the traffic signal may be obstructed by the leading vehicle, the method may be configured to display 445 an indication of the traffic signal state to the vehicle operator. This indication may be an augmented reality traffic signal displayed on an HUD or the like. Alternatively, the indication may be displayed on a vehicle user interface, such as a center stack display, an instrument cluster display, or the like.
If the distance to the leading vehicle is greater than the distance threshold, the method next determines 450 if the distance is approaching the threshold distance. The distance may be approaching the threshold distance if the host vehicle is travelling faster than the leading vehicle. If the distance is not approaching the threshold distance, the method may return to detecting 410 a traffic signal location. If the distance is approaching the threshold distance, the method may next issue 455 a driver alert indicative to the vehicle operator that the traffic signal may soon not be visible to the vehicle operator, or may not be detectable by the host vehicle sensors. This driver alert may be a flashing light, a chime, buzzer or other audible alarm, or other graphic presented on a graphical user interface. In response to the driver alert, the vehicle operator or the ADAS may reduce the vehicle speed to ensure that at least the threshold distance is maintained between the leading vehicle and the host vehicle such that the traffic signal remains visible.
Turning now to FIG. 5 , a block diagram of a system 500 for traffic light visibility detection and augmented display according to an exemplary embodiment of the present disclosure is shown. The exemplary system 500 may include a camera 510, a receiver 520, a processor 530, a display 540, a GPS 550, a memory 560, a lidar 570 and a vehicle controller 580.
The camera 510 may be integral to the host vehicle and may be configured for capturing an image wherein the image includes a representation of a leading vehicle. In some exemplary embodiments, the camera 510 may be a forward-facing camera for capturing a forward field of view from the host vehicle. The camera 510 may capture a series of images at periodic time intervals. The images may then be coupled to an image processor, or the processor 530, for analysis using image processing techniques. The images may be used to determine the leading vehicle location, distance between the leading vehicle and the host vehicle, distance and location of the traffic signal, dimensions of the leading vehicle and dimensions associated with the traffic signal, such as height off of the ground and the like.
The receiver 520 may be a radio frequency receiver or other wireless information receiver configured to receive data indicative of a traffic signal location and a traffic signal cycle state. The data may be transmitted from a traffic signal controller via a V2I communications channel or network. In some exemplary embodiments, the receiver 520 may be a SPaT receiver and the traffic signal location and the traffic signal cycle state are indicated in a SPaT data message.
The processor 530 may be a vehicle control processor, advanced driving assist processor or the like, configured to generate an augmented reality representation of the traffic signal including the traffic signal cycle state, which may further include a countdown timer indicative of a time remaining in the traffic signal state cycle. The processor 530 may be configured to determine a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image captured by the camera 510. The processor 530 may next determine a second distance between the host vehicle and the traffic signal in response to a map data or the like. The processor 530 may then estimate a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance. The traffic signal view obstruction may be indicative of the leading vehicle obstructing a line of sight between a host vehicle operator or host vehicle sensor and the traffic signal. For example, the traffic signal view obstruction may be estimated in response to a height of the leading vehicle being within a line of sight between the host vehicle and the traffic signal. The processor 530 may then generate a graphical user interface in response to the traffic signal view obstruction.
The display 540 may be configured to receive the graphical user interface from the processor 530 and to display the graphical user interface to a vehicle operator on a display within the host vehicle. The graphical user interface displayed to the vehicle operator may be indicative of the traffic signal cycle state. The graphical user interface may also display a countdown timer indicative of a time remaining until an upcoming state change of the traffic signal. In some exemplary embodiments, the display 540 may be a heads up display configured for presenting an augmented reality representation of the traffic signal and the traffic signal cycle state to a vehicle operator.
The exemplary system 500 may further include GPS 550 for receiving data from one or more satellites and estimating a host vehicle location in response to the received data. The GPS 550 may then couple GPS data to the processor 530 indicative of a host vehicle location. A memory 560 may be communicatively coupled to the processor 530 for storing a map data. The map data may be indicative of a traffic signal location. The system may further include a lidar 570 configured to capture a depth map indicative of the leading vehicle dimension and the first distance. The depth map may be used to determine dimensions of a leading vehicle and/or distances from the host vehicle to the leading vehicle and/or the traffic signal or the like.
The system 500 may further include a vehicle controller 580 for controlling one or more of the steering, throttle and/or braking of the host vehicle. The vehicle controller 580 may be configured to perform ADAS algorithms. The vehicle controller 580 may further provide data indicative of a host vehicle speed, wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the second distance, and to generate a driver alert in response to the future traffic signal view obstruction.
In some exemplary embodiments, the system 500 may be an advanced driver-assistance system including a GPS 550 configured for receiving a location data indicative of a host vehicle location. A host vehicle sensor may include a camera and a lidar for capturing an image. The image may be used for detecting a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle. In addition, the system may include a receiver 520 configured to receive a traffic signal data indicative of a traffic signal location and a traffic signal cycle state. A processor 530 may be configured to determine a traffic signal view obstruction between the host vehicle and a traffic signal in response to the host vehicle location, the first distance, the traffic signal location and the leading vehicle dimension. The system 500 may further include a display 540 configured to visually present a representation of the traffic signal to a vehicle operator in response to the determination of the traffic signal view obstruction and the traffic signal cycle state.
The exemplary system may further include a vehicle controller 580 for detecting a host vehicle speed. The processor 530 may be further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle dimension, the first distance, and the traffic signal location and for generating a driver alert in response to the future traffic signal obstruction.
Turning now to FIG. 6 , a flow chart illustrating another exemplary method 600 for implementing the traffic light visibility detection and augmented display system in a motor vehicle according to an exemplary embodiment of the present disclosure is shown. The method is first operative for capturing 610, by a host vehicle camera, an image of a leading vehicle. In some exemplary embodiments, the image may be a depth map captured by a lidar system.
The method is next operative for determining 620 a host vehicle location in response to a location data received via a GPS or global navigation satellite system (GNSS). The host vehicle location may be indicated by the received data and may be confirmed in response to map data stored within a memory and correlation with landmarks or other indicators proximate to the host vehicle and detected with other host vehicle sensors.
The method next estimates 630 a leading vehicle dimension and a first distance between a host vehicle and the leading vehicle in response to the image. Image processing techniques in addition to multiple images captured at different time intervals may be used to determine a leading vehicle dimension, such as height or width. In addition, the distance from the host vehicle to the leading vehicle may be estimated using multiple images which may vary in time and/or space. Alternatively, the leading vehicle dimension and the first distance between a host vehicle and the leading vehicle may be determined in response to a depth map generated by a host vehicle lidar system.
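One common single-image route to the distance estimate mentioned here is the pinhole-camera relation, assuming the leading vehicle's real width is known (e.g. reported over V2V) and the camera focal length in pixels has been calibrated. Both assumptions are illustrative, not details from the disclosure:

```python
def distance_from_width(real_width_m: float, pixel_width: float,
                        focal_px: float) -> float:
    """Pinhole-camera range estimate: z = f * W / w, where W is the
    object's real width and w its imaged width in pixels."""
    return focal_px * real_width_m / pixel_width

# A 1.9 m wide car imaged 95 px wide by a camera with a 950 px focal
# length is about 19 m ahead.
```

Multi-frame methods (structure from motion, feature tracking) refine this, but the single-frame relation already gives the gap used by the visibility threshold.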
The system may then receive 640 data indicative of a traffic signal location and a traffic signal cycle state. The data may be received from the traffic signal controller or the like via a V2I communications channel. The V2I communications channel may be a wireless radio frequency channel. The data indicative of the traffic signal location and the traffic signal cycle state may be received via a SPaT data message received by a SPaT data receiver. Alternatively, the data indicative of the traffic signal location may be stored in a memory in the host vehicle.
The method next determines 650 a second distance between the host vehicle and the traffic signal. The location of the host vehicle may be determined in response to the GPS or GNSS data. The location of the traffic signal may be determined in response to map data or may be determined in response to sensor data such as a camera image or a lidar depth map.
The method is next configured for determining 660 a traffic signal view obstruction in response to the leading vehicle dimension, the first distance, and the second distance. The traffic signal view obstruction may be estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal. The obstruction may be determined using geometric techniques to determine an intersection between the height and/or width of the leading vehicle and a line of sight from the vehicle operator to the traffic signal. In some exemplary embodiments, the method may further determine a future traffic signal view obstruction in response to a host vehicle speed, the leading vehicle dimension, the first distance, and the second distance, and generate a driver alert in response to the future traffic signal view obstruction. If a traffic signal view obstruction is determined, the method next generates 670 a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction. The graphical user interface may include a graphical representation of the traffic signal including the current traffic signal cycle state, such as red or green, as well as a countdown timer indicative of a remaining time in the current traffic signal cycle state.
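A step-660-style visibility test can be expressed as a boolean check against the sight line. This is a flat-road, height-only sketch; the disclosure also contemplates width and lateral offset, which are omitted here, and the parameter names are assumptions:

```python
def is_signal_obstructed(h_lead: float, gap: float, h_eye: float,
                         h_signal: float, dist_to_signal: float) -> bool:
    """True when the leading vehicle's roof line rises above the sight
    line from the driver's eye to the signal head (flat-road sketch).

    The sight line's height at the leading vehicle's position (gap
    metres ahead) interpolates linearly between h_eye and h_signal.
    """
    sight_height_at_gap = h_eye + (h_signal - h_eye) * gap / dist_to_signal
    return h_lead > sight_height_at_gap

# A 4.0 m truck 20 m ahead hides a 5.5 m signal head 80 m away from a
# 1.2 m eye height; at a 60 m gap the same signal is visible again.
```

When the test returns true, the method proceeds to step 670 and renders the traffic-signal graphic in place of the hidden signal.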
The method is then configured for displaying 680 the graphical user interface to a vehicle operator via a heads up display. The graphical user interface may be an augmented reality representation of a traffic signal indicative of the traffic signal cycle state and is presented to a vehicle operator on an augmented reality heads up display. Alternatively, the graphical user interface may be a display screen within the vehicle cabin viewable by the vehicle operator, such as a center stack display, instrument cluster display, dashboard display or the like.
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the disclosure in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing the exemplary embodiment or exemplary embodiments. It should be understood that various changes can be made in the function and arrangement of elements without departing from the scope of the disclosure as set forth in the appended claims and the legal equivalents thereof.
Claims (20)
1. An apparatus comprising:
a camera for capturing an image wherein the image includes a representation of a leading vehicle;
a receiver configured to receive a traffic signal location and a traffic signal cycle state;
a processor configured to estimate a leading vehicle height and a leading vehicle width, a first distance between a host vehicle and the leading vehicle in response to the image, a second distance between the host vehicle and a traffic signal in response to the traffic signal location, estimating a traffic signal view obstruction in response to the leading vehicle height and the leading vehicle width, the first distance, and the second distance, and generating a graphical user interface in response to the traffic signal view obstruction;
a vehicle controller for reducing a host vehicle speed in response to the traffic signal view obstruction to ensure that at least a threshold distance is maintained between the host vehicle and the leading vehicle such that the traffic signal remains visible; and
a display configured to display the graphical user interface wherein the graphical user interface is indicative of the traffic signal cycle state.
2. The apparatus of claim 1, wherein the graphical user interface is further indicative of a time to state change of the traffic signal.
3. The apparatus of claim 1, further including a memory for storing a map data indicative of the traffic signal location.
4. The apparatus of claim 1, wherein the display is a heads up display.
5. The apparatus of claim 1, wherein the traffic signal view obstruction is estimated in response to the leading vehicle height exceeding a threshold value and the first distance being greater than a minimum distance.
6. The apparatus of claim 1, further including a lidar configured to capture a depth map indicative of the leading vehicle height and the leading vehicle width and the first distance.
7. The apparatus of claim 1, wherein the graphical user interface is at least one of a center stack display and an instrument cluster display.
8. The apparatus of claim 1, wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle height and the leading vehicle width, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal view obstruction indicative of a time to the future traffic signal view obstruction.
9. The apparatus of claim 1, wherein the graphical user interface is an augmented reality representation of the traffic signal indicative of the traffic signal cycle state and wherein the display is an augmented reality heads up display.
10. A method comprising:
capturing, by a host vehicle camera, an image of a leading vehicle;
estimating a leading vehicle height and a leading vehicle width and a first distance between a host vehicle and the leading vehicle in response to the image;
receiving a data indicative of a traffic signal location and a traffic signal cycle state;
determining a second distance between the host vehicle and a traffic signal;
estimating a traffic signal view obstruction in response to the leading vehicle height and the leading vehicle width, the first distance, and the second distance;
reducing a host vehicle speed in response to the traffic signal view obstruction to ensure that at least a threshold distance is maintained between the host vehicle and the leading vehicle such that the traffic signal remains visible; and
generating a graphical user interface indicative of the traffic signal cycle state in response to the traffic signal view obstruction.
11. The method of claim 10, further including determining a host vehicle location in response to a location data received via a global positioning system.
12. The method of claim 10, further including displaying the graphical user interface to a vehicle operator via a heads up display.
13. The method of claim 10, wherein the data indicative of the traffic signal location is stored in a memory in the host vehicle.
14. The method of claim 10, wherein the traffic signal view obstruction is estimated in response to a portion of the leading vehicle being within a line of sight between the host vehicle and the traffic signal.
15. The method of claim 10, wherein the leading vehicle height and the leading vehicle width and the first distance between the host vehicle and the leading vehicle are determined in response to a depth map generated by a host vehicle lidar system.
16. The method of claim 10, wherein the data indicative of the traffic signal location and the traffic signal cycle state is received via a SPaT data message received by a SPaT data receiver.
17. The method of claim 10, further including determining a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle height and the leading vehicle width, the first distance, and the second distance and for generating a driver alert in response to the future traffic signal view obstruction.
18. The method of claim 10, wherein the graphical user interface is an augmented reality representation of the traffic signal indicative of the traffic signal cycle state and is presented to a vehicle operator on an augmented reality heads up display.
19. An advanced driver-assistance system comprising:
a global positioning system configured for receiving a location data indicative of a host vehicle location;
a host vehicle sensor for detecting a leading vehicle height and a leading vehicle width and a first distance between a host vehicle and a leading vehicle;
a receiver configured to receive a traffic signal data indicative of a traffic signal location and a traffic signal cycle state;
a processor configured to determine a traffic signal view obstruction between the host vehicle and a traffic signal in response to the host vehicle location, the first distance, the traffic signal location, the leading vehicle height and the leading vehicle width;
a vehicle controller for reducing a host vehicle speed in response to the traffic signal view obstruction to ensure that at least a threshold distance is maintained between the host vehicle and the leading vehicle such that the traffic signal remains visible; and
a display configured to display a representation of the traffic signal in response to the traffic signal view obstruction and the traffic signal cycle state.
20. The advanced driver-assistance system of claim 19, wherein the processor is further configured to determine a future traffic signal view obstruction in response to the host vehicle speed, the leading vehicle height and the leading vehicle width, the first distance, and the traffic signal location and for generating a driver alert in response to the future traffic signal view obstruction.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202111268575.8 | 2021-10-29 | | |
| CN202111268575.8A | 2021-10-29 | 2021-10-29 | Traffic light visibility detection and enhanced display |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20230133131A1 (en) | 2023-05-04 |
| US11715373B2 (en) | 2023-08-01 |
Family
ID=85983650
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/456,723 (US11715373B2, Active) | Traffic light visibility detection and augmented display | 2021-10-29 | 2021-11-29 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US11715373B2 (en) |
| CN (1) | CN116071945A (en) |
| DE (1) | DE102022119199A1 (en) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102021005311A1 (en) * | 2021-10-26 | 2023-04-27 | Mercedes-Benz Group AG | Method for automatic control of a longitudinal movement of a vehicle |
| CN116978246A (en) * | 2023-07-14 | 2023-10-31 | 长城汽车股份有限公司 | Vehicle prompting method, device, vehicle and readable storage medium |
| DE102023207727A1 (en) | 2023-08-10 | 2025-02-13 | Volkswagen Aktiengesellschaft | Vehicle network and method for controlling vehicle functions |
| US20260008461A1 (en) * | 2024-07-08 | 2026-01-08 | Toyota Motor Engineering & Manufacturing North America, Inc. | Mitigating occlusions with sensor sharing/cooperative perception |
| CN118781807A (en) * | 2024-07-09 | 2024-10-15 | 北京高德云图科技有限公司 | Traffic information processing method, device and electronic equipment |
Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050280553A1 (en) * | 2004-06-16 | 2005-12-22 | Dipiazza Gerald C | Wireless traffic control system |
| US20090063030A1 (en) * | 2007-08-31 | 2009-03-05 | Embarq Holdings Company, Llc | System and method for traffic condition detection |
| US20120013713A1 (en) * | 2009-03-31 | 2012-01-19 | Hironori Sumitomo | Image integration unit and image integration method |
| US20140019005A1 (en) * | 2012-07-10 | 2014-01-16 | Samsung Electronics Co., Ltd. | Transparent display apparatus for displaying information of danger element, and method thereof |
| US20150321606A1 (en) * | 2014-05-09 | 2015-11-12 | HJ Laboratories, LLC | Adaptive conveyance operating system |
| US20180012088A1 (en) * | 2014-07-08 | 2018-01-11 | Nissan Motor Co., Ltd. | Traffic Light Detection Device and Traffic Light Detection Method |
| US20180286233A1 (en) * | 2017-03-31 | 2018-10-04 | Panasonic Intellectual Property Management Co., Ltd. | Driving assistance device, driving assistance method, and non-transitory storage medium |
| US20200081448A1 (en) * | 2018-09-07 | 2020-03-12 | GM Global Technology Operations LLC | Traffic light occlusion detection for autonomous vehicle |
| US20200158530A1 (en) * | 2018-11-19 | 2020-05-21 | Here Global B.V. | Navigation using dynamic intersection map data |
| US20200174261A1 (en) * | 2018-11-30 | 2020-06-04 | International Business Machines Corporation | In-vehicle content display apparatus |
| US20200250443A1 (en) * | 2019-02-01 | 2020-08-06 | Toyota Jidosha Kabushiki Kaisha | Information processing device, server, and traffic management system |
| US20200361482A1 (en) * | 2016-05-30 | 2020-11-19 | Lg Electronics Inc. | Vehicle display device and vehicle |
| US20210097314A1 (en) * | 2019-09-27 | 2021-04-01 | The Travelers Indemnity Company | Systems and methods for artificial intelligence (ai) driving analysis and incentives |
| US20210174673A1 (en) * | 2019-12-10 | 2021-06-10 | Honda Motor Co., Ltd. | Autonomous driving vehicle information presentation apparatus |
| US20210278539A1 (en) * | 2020-03-05 | 2021-09-09 | Uatc, Llc | Systems and Methods for Object Detection and Motion Prediction by Fusing Multiple Sensor Sweeps into a Range View Representation |
| US20210303886A1 (en) * | 2020-03-31 | 2021-09-30 | Robert Bosch Gmbh | Semantically-consistent augmented training data for traffic light detection |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102015015676A1 (en) * | 2015-12-03 | 2016-07-21 | Daimler Ag | Method for automatically displaying an environment image on a display unit of a vehicle |
| CN109637171A (en) * | 2018-12-03 | 2019-04-16 | 北京盘古影艺文化传播有限公司 | Traffic light change reminding method and system |
| CN111754794B (en) * | 2020-05-14 | 2021-11-23 | 深圳市奥拓电子股份有限公司 | Highway traffic information display method, device and system based on intelligent lamp pole |
2021
- 2021-10-29 CN CN202111268575.8A patent/CN116071945A/en active Pending
- 2021-11-29 US US17/456,723 patent/US11715373B2/en active Active

2022
- 2022-08-01 DE DE102022119199.0A patent/DE102022119199A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| US20230133131A1 (en) | 2023-05-04 |
| CN116071945A (en) | 2023-05-05 |
| DE102022119199A1 (en) | 2023-05-04 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11715373B2 (en) | Traffic light visibility detection and augmented display | |
| US10169997B2 (en) | Vehicle alert apparatus | |
| JP6326004B2 (en) | Other vehicle position detector | |
| US7205888B2 (en) | Driving assisting apparatus for preventing vehicular collision | |
| EP3293488A2 (en) | System and method of simultaneously generating a multiple lane map and localizing a vehicle in the generated map | |
| EP3226022B1 (en) | Notification device of an approaching vehicle | |
| US10262629B2 (en) | Display device | |
| US20190073540A1 (en) | Vehicle control device, vehicle control method, and storage medium | |
| US11361687B2 (en) | Advertisement display device, vehicle, and advertisement display method | |
| EP3486133B1 (en) | Travel control method and travel control device | |
| CN106415693A (en) | Cognitive notification device for vehicles, cognitive notification system for vehicles | |
| JP6500724B2 (en) | Danger information notification system, server and computer program | |
| JP2008210051A (en) | Driving support system for vehicle | |
| US20200341111A1 (en) | Method and apparatus for radar detection confirmation | |
| JP2017003395A (en) | Vehicle positioning system | |
| JP5104372B2 (en) | Inter-vehicle communication system, inter-vehicle communication device | |
| US20220319317A1 (en) | Driving assist apparatus | |
| JP2007257303A (en) | Traffic light recognition device | |
| EP3207493A1 (en) | Systems and methods for traffic sign assistance | |
| JP2019121274A (en) | Notification device and on-vehicle device | |
| US12384294B2 (en) | Driving support device, driving support method, and driving support program for notifying a driver of a vehicle of a target object approaching the vehicle | |
| US11670167B2 (en) | Traffic light countdown notification and alert suppression | |
| JP6476921B2 (en) | Dangerous vehicle detection system and in-vehicle information processing apparatus | |
| JP2018106355A (en) | Driving assistance device | |
| US20250074194A1 (en) | Display device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NASERIAN, MOHAMMAD;CONG, FEIYU;ZHU, JIE;AND OTHERS;SIGNING DATES FROM 20210915 TO 20210916;REEL/FRAME:058227/0712 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |