SE542785C2 - Method and control arrangement for controlling an ADAS - Google Patents

Method and control arrangement for controlling an ADAS

Info

Publication number
SE542785C2
SE542785C2 (application SE1850843A)
Authority
SE
Sweden
Prior art keywords
vehicle
intersection
information
driver
driving direction
Prior art date
Application number
SE1850843A
Other versions
SE1850843A1 (en)
Inventor
Andreas Höglund
Jimmy Nichols
Jonny Johansson
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1850843A priority Critical patent/SE542785C2/en
Priority to DE102019004481.9A priority patent/DE102019004481A1/en
Publication of SE1850843A1 publication Critical patent/SE1850843A1/en
Publication of SE542785C2 publication Critical patent/SE542785C2/en

Classifications

    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/18009 Propelling the vehicle related to particular drive situations
    • B60W30/18154 Approaching an intersection
    • B60W30/18159 Traversing an intersection
    • B60W50/0097 Predicting future conditions
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584 Recognition of vehicle lights or traffic lights
    • G08G1/095 Traffic lights
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623 Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • B60W2050/143 Alarm means
    • B60W2050/146 Display means
    • B60W2554/00 Input parameters relating to objects
    • B60W2554/80 Spatial relation or speed relative to objects
    • B60W2555/60 Traffic rules, e.g. speed limits or right of way

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

Method (500) and control arrangement (300) for controlling an ADAS (150) of a vehicle (100), which ADAS (150) is configured to monitor a vehicle environment in a future driving direction of the vehicle (100). The method (500) comprises determining (501) that the vehicle (100) is approaching an intersection (400), regulated by at least one traffic light (120, 220, 240); predicting (502), before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); identifying (503) the traffic light (120) regulating the traffic in the predicted (502) future driving direction; receiving (504) information from a transmitter (130), associated with the identified (503) traffic light (120); and activating (505) at least a human machine interface (310) of the ADAS (150) of the vehicle (100), based on the received (504) information.

Description

METHOD AND CONTROL ARRANGEMENT FOR CONTROLLING AN ADAS

TECHNICAL FIELD
This document discloses a method and a control arrangement. More particularly, a method and a control arrangement are described, for controlling an advanced driver-assistance system of a vehicle, which advanced driver-assistance system is configured to monitor a vehicle environment in a future driving direction of the vehicle.
BACKGROUND
In vehicles, in particular heavy vehicles such as trucks with or without trailers, buses, vehicle combinations, road trains, etc., there are several blind spots around the vehicle, making it difficult for the driver to detect other road users around the vehicle.
The driver's visibility may also be limited by, e.g., darkness, a cloudy sky in combination with heavy rain, hail, snow, a blizzard, fog, smoke, pollution, or similar weather conditions.
In such reduced visibility situations, the driver may not see relevant information such as obstacles, vulnerable road users, traffic signs, road signs etc.
A known solution to these problems is to provide an Advanced Driver-Assistance System (ADAS), which assists the driver in detecting various dangers in the traffic via sensors around the vehicle and outputs an alert via a human machine interface such as a display, a loudspeaker, etc. Thereby, the driver's attention may be drawn to the detected, potentially dangerous situation.
However, unnecessary or redundant activation of an in-vehicle safety system lowers the driver's confidence in the system and may also cause a high cognitive workload for the driver. This in turn can lead to the driver being distracted and cause accidents. Another driver reaction may be to disable the ADAS, which of course also may present a traffic danger.
Document DE102016012054 shows a method of increasing pedestrian safety at intersections by having the crossing vehicle receive information about the different phases of traffic lights, via its own cameras and/or via traffic light phases from an external server, and detecting the presence of pedestrians around the road.
The problem with the solution provided in the document is that the driver may become overwhelmed by information, which may erode the driver's confidence in the system and distract the driver from real dangers in the traffic.
Document US2017032197 shows a system that activates cameras on the vehicle to check the status of traffic lights and detect obstacles on the road in front of the vehicle. The vehicle's automatic brake control is activated for braking if necessary.
A problem with the disclosed solution is that a variety of information is continuously delivered to the driver, which may distract him/ her.
Document US20160318490 shows a method in which the phase of a traffic light is detected by a camera and if the system considers that it is necessary, a warning is issued to the driver or, for example, safety measures are activated such as acceleration, deceleration, turn, etc.
Detection of the traffic light in front of the vehicle does, in general, not present any major traffic danger. The document does not present any solution to the problem of assisting the driver in discovering potential problems or dangers which are not readily visible.
Document DE102015210146 shows a method of preventing collisions between two vehicles at a traffic light, where detection of the traffic light phase (via information transmitted from the traffic signals) occurs and, depending on how the phases are about to change, a safety system is activated, e.g. in the form of braking of either vehicle.
Again, the solution is more concerned with detecting obstacles which are visible to the driver anyway, rather than helping him or her to observe obstacles or other road users which are not visible, i.e. situated in a blind spot of the driver.
It would be desirable to avoid alerting the driver in non-dangerous situations, for example when the driver already is aware of the potential danger, while still being able to alert the driver when a substantial danger, and/or a danger unknown to the driver, appears.
SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and improve traffic safety, in particular when passing an intersection.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle, for controlling an advanced driver-assistance system of a vehicle, which advanced driver-assistance system is configured to monitor a vehicle environment in a future driving direction of the vehicle. The method comprises the steps of determining that the vehicle is approaching an intersection, regulated by at least one traffic light. Further, the method also comprises predicting, before the vehicle is passing the intersection, a future driving direction of the vehicle after the intersection. The method in addition comprises identifying the traffic light regulating the traffic in the predicted future driving direction. The method also comprises receiving information from a transmitter, associated with the identified traffic light. In addition, the method comprises activating at least a human machine interface of the advanced driver-assistance system of the vehicle, based on the received information.
According to a second aspect of the invention, this objective is achieved by a control arrangement in a vehicle. The control arrangement is configured for controlling an advanced driver-assistance system of a vehicle, which advanced driver-assistance system is configured to monitor a vehicle environment in a future driving direction of the vehicle. The control arrangement is configured to determine that the vehicle is approaching an intersection, regulated by at least one traffic light. Further, the control arrangement is configured to predict, before the vehicle is passing the intersection, a future driving direction of the vehicle after the intersection. The control arrangement is furthermore configured to identify the traffic light regulating the traffic in the predicted future driving direction. Also, the control arrangement is configured to receive information from a transmitter, associated with the identified traffic light. The control arrangement is additionally configured to activate at least a human machine interface of the advanced driver-assistance system of the vehicle, based on the received information.
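The identification step among several lights at the intersection can be sketched as a simple selection on the predicted direction. The following Python sketch is purely illustrative; the data fields and function names are assumptions and do not appear in the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrafficLightInfo:
    """Information received from a traffic light transmitter (fields are illustrative)."""
    light_id: str
    direction: str           # driving direction regulated by this light: "left"/"straight"/"right"
    state: str               # current phase: "red", "yellow" or "green"
    seconds_to_green: float  # remaining time until the next shift to green

def identify_light(lights: list, predicted_direction: str) -> Optional[TrafficLightInfo]:
    """Among the (possibly independent) lights at the intersection, pick the
    one regulating the predicted future driving direction; lights of no
    relevance to the vehicle are thereby filtered out."""
    for light in lights:
        if light.direction == predicted_direction:
            return light
    return None
```

The human machine interface would then be activated based only on the information received from the selected light, rather than from every light at the intersection.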
Thanks to the described aspects, by using information concerning a traffic light's next phase shift to green, a human machine interface may be activated for outputting information of the advanced driver-assistance system, thereby calling the driver's attention to an obstacle, e.g. in a blind spot of the driver, which may present a potential danger. Thereby, an accident may be avoided. By outputting information when it is relevant for the current traffic situation and the predicted movement of the vehicle, and avoiding outputting irrelevant information otherwise, traffic safety is enhanced, while preventing the driver from losing interest in the information provided by the advanced driver-assistance system.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:
Figure 1 illustrates an example of a vehicle equipped with an embodiment of the invention, approaching a road intersection;
Figure 2 illustrates an example of a vehicle equipped with an embodiment of the invention, approaching a road intersection as regarded from above;
Figure 3 illustrates an example of a vehicle equipped with an embodiment of the invention, as regarded from within the vehicle when approaching a road intersection;
Figure 4 illustrates an example of a vehicle approaching a road intersection as regarded from above, according to an embodiment;
Figure 5 is a flow chart illustrating an embodiment of the method; and
Figure 6 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as a method, a control arrangement and a system, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1 illustrates a scenario with a vehicle 100 driving in a driving direction 105 on a road 110a, approaching a road intersection where a traffic light 120 is situated to regulate the traffic.
The vehicle 100 may comprise a means for transportation in a broad sense, such as e.g. a truck, a car, a motorcycle, a vehicle train, a combination vehicle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, an unmanned underwater vehicle/underwater drone, a cable transport, an aerial tramway, a drone, a humanoid service robot, a spacecraft, or other similar manned or unmanned means of conveyance running e.g. on wheels, on rails, in air, in/on water, in space, etc.
The traffic light 120 may alternatively also be referred to as traffic signal, traffic lamp, traffic semaphore, signal light, stop light, and / or traffic control signal. The traffic light 120 may control the traffic at an intersection of a road, at a train/ road intersection, a bridge, at a ferry, for entering a parking lot/ house, etc. However, the expression traffic light 120 may in the current context also be understood as a dynamic/ electronic traffic sign, for example, configured to output a dynamic content for controlling the traffic flow.
The vehicle 100 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.
The traffic light 120 may comprise, or be associated with, a transmitter 130. The traffic light 120 may be connected to a central server in some embodiments.
The vehicle 100 may comprise a communication device 140, configured for wireless communication with the transmitter 130 associated with the traffic light 120. Thereby, information may be communicated between the traffic light 120 and the vehicle 100, such as, for example, the remaining time until the traffic light 120 turns from red to green, or how much time remains until the traffic light 120 turns from green to red.
Communication between the transmitter 130 and the communication device 140 may be made over a wireless communication interface, such as e.g. Vehicle-to-Vehicle (V2V) communication, or Vehicle-to-Infrastructure (V2I) communication. The common term Vehicle-to-Everything (V2X) is sometimes used. The communication may e.g. be based on Dedicated Short-Range Communications (DSRC) devices. DSRC operates in the 5.9 GHz band, with a bandwidth of 75 MHz and an approximate range of 1000 m, in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, e.g. the special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.
Such wireless communication interface may comprise, or at least be inspired by wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Radio-Frequency Identification (RFID), etc.
The communication may alternatively be made over a wireless interface comprising, or at least being inspired by radio access technologies such as e.g. 3GPP LTE, LTE-Advanced, E-UTRAN, UMTS, GSM, GSM/ EDGE, WCDMA, Time Division Multiple Access (TDMA) networks, or similar, just to mention some few options, via a wireless communication network.
Thereby, information may be provided from the traffic light 120 to the vehicle 100, such as, for example, the current light state (green/yellow/red), light timing plans of the traffic light 120, or the remaining time in the green/red state. Also, extraordinary information such as emergency vehicle pre-emptions and/or variations in timing plans may be communicated. The predictive information may be made available to various applications in the vehicle 100, such as a red-light countdown, cruise control/"green wave" management, remaining driving time estimation, etc., and each application may determine whether the prediction is strong enough for its intended purpose.
Hereby, driver stress may be decreased while traffic safety is enhanced. Also, fuel efficiency may be increased, and carbon emissions may be decreased, by turning off the engine of the vehicle 100 when the remaining time in the red state exceeds a threshold limit, and/or by decreasing the vehicle speed in order to arrive at the traffic light 120 as it switches to green.
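As a rough illustration of these two applications, assuming constant vehicle speed, the following sketch computes a "green wave" speed recommendation and an engine-stop decision. The function names and the 20-second threshold are illustrative assumptions, not values from the patent:

```python
def recommended_speed(distance_m: float, seconds_to_green: float,
                      current_speed_ms: float, max_speed_ms: float) -> float:
    """Choose a constant speed (m/s) so the vehicle reaches the traffic
    light roughly as it switches to green, capped at the allowed maximum."""
    if seconds_to_green <= 0:
        return current_speed_ms  # light is already green; keep current speed
    needed = distance_m / seconds_to_green
    return min(max(needed, 0.0), max_speed_ms)

def should_stop_engine(seconds_to_green: float, threshold_s: float = 20.0) -> bool:
    """Turn the engine off while waiting at red, when the remaining red
    time exceeds a threshold (illustrative default)."""
    return seconds_to_green > threshold_s
```

For example, 100 m from a light that turns green in 10 s, the recommendation is 10 m/s (36 km/h) rather than stopping and restarting.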
However, the intersection may have several independent traffic lights 120 that could apply to the vehicle 100. In order to give the driver (if any) correct information, it is also important to know in which direction the vehicle 100 is heading, i.e. going straight ahead or turning left/right. In many cases it may also be an advantage to be able to consider multiple intersections ahead of the vehicle 100, in the future driving direction of the vehicle 100, in order to give better speed recommendations or provide an improved estimation of the time of arrival.
Which way the vehicle 100 is going to drive may be determined, for example, by extracting navigation information from a navigator of the vehicle 100; by determining the position of the vehicle 100 and determining which driving lane the vehicle 100 is situated in; and/or based on sensor information, e.g. detecting arrows on the driving lane and/or the traffic light 120, or the driving direction of the vehicle in front.
Thus, the vehicle 100 may comprise a sensor 145 in some embodiments, to detect the future driving direction of the vehicle 100.
The sensor 145 may be forwardly directed in the driving direction 105 of the vehicle 100. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 145 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.
The sensor 145 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments. The sensor 145 may be dedicated to detecting driving lane direction, e.g. by detecting a driving direction arrow painted on the driving lane; however, the sensor 145 may also or alternatively be used for a variety of other tasks.
The sensor 145 may comprise, or be connected to, a control arrangement configured for image recognition/computer vision and object recognition. Thereby, the process of detecting the driving intention of the driver may be automated. The image understanding can be seen as the disentangling of symbolic information from image data, using models constructed with the aid of geometry, physics, statistics and learning theory.
The image data of the sensor 145 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.
The driving direction prediction may comprise one or several algorithms that can be used independently or in combination with each other, depending on the traffic situation. These algorithms can also be used together with other algorithms that are not stated here.
Based on the sensor 145, or another device such as high-precision positioning (GPS or similar) together with a map, it may be detected which lane the vehicle 100 is travelling in. The information captured by the camera/sensor 145 may be compared with information regarding the lanes leading to the traffic light 120, in order to determine which traffic light 120 the lane is related to. The closer the vehicle 100 gets to the traffic light 120, the higher the probability that the current lane indicates the direction the vehicle 100 will have when passing the intersection. To increase the confidence in the prediction, the type of lines that separate the driving lanes may be determined and used. If the line between the lanes indicates that it is not allowed to change lane, it is very likely that the direction of the current lane is the direction that the vehicle 100 will have when passing the traffic light 120.
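The proximity-and-lane-marking heuristic described above could, for instance, be expressed as a confidence score that grows as the vehicle approaches the light and is boosted when a solid line forbids lane changes. All numeric weights below are illustrative assumptions; the patent prescribes no particular formula:

```python
def lane_direction_confidence(distance_to_light_m: float,
                              lane_change_allowed: bool,
                              max_distance_m: float = 300.0) -> float:
    """Confidence (0..1) that the current lane's direction is the direction
    the vehicle will take when passing the intersection. Closer to the light
    means higher confidence; a no-lane-change marking raises it further."""
    proximity = max(0.0, 1.0 - distance_to_light_m / max_distance_m)
    confidence = 0.5 + 0.4 * proximity          # baseline grows with proximity
    if not lane_change_allowed:
        confidence = min(1.0, confidence + 0.2) # solid line: lane is committed
    return confidence
```

With this weighting, a vehicle at the stop line behind a solid line gets full confidence, while a vehicle 300 m away in a lane it may still leave gets only the 0.5 baseline.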
A route that the vehicle 100 intends to follow may be extracted, for example from a navigation function of the vehicle 100, thereby detecting which direction the navigator will recommend. This information may also be verified by comparing the suggested driving direction with the detected driving lane wherein the vehicle 100 is currently situated.
Further, information about the current traffic situation may be obtained, for example local hazard/road works warnings which are distributed by various traffic information services. It may thereby, by also determining the geographical position of the vehicle 100, be determined whether the vehicle 100 is approaching a road section where at least one of the lanes is partly or completely blocked, e.g. by road works, an accident, etc. In this case, i.e. when the lane within which the vehicle 100 currently is determined to be situated, or a parallel lane, is blocked, the confidence of the prediction may be considered low, and the direction prediction may be paused until the vehicle 100 has passed the local hazard, to see if the vehicle 100 changes lanes or not. The obtained information may also be verified by comparing the suggested direction with the lane detected according to the determination of the vehicle position performed by the navigator.
The geographical position of the vehicle 100 may be determined, and it may be detected, using map data of the determined geographical position, whether any of the lanes has a restricted use, for example a bus lane. This may also be combined with any type of map data to increase the confidence in the information. If the vehicle 100 is travelling in a restricted lane and is allowed to do so, for example when the vehicle 100 is a bus, taxi, emergency vehicle, etc., driving in a bus lane, it may be seen as probable that the vehicle 100 will follow the direction of that lane. If the vehicle 100 is not allowed to travel in the restricted lane, a check may be made to determine whether the lane ends before the traffic lights 120. If it does not, it is likely not to be used and can be discarded from the prediction; if it does end, it cannot be discarded as a possible lane to be used.
Further, the direction indicator may be used to determine the intention of the driver of the vehicle 100. All or some of these different aspects may be combined in order to make an improved, i.e. more confident, prediction of the intended driving direction.
Further, the different methods described above may be used in any possible combination. Thereby, a more accurate prediction of the direction of the vehicle 100 may be made; hence it is more likely that traffic lights 120 that are of no relevance to the driver/vehicle 100 can be filtered out and do not have to be displayed.
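One simple way to combine the described cues (current lane, navigator route, direction indicator, vehicle ahead, ...) is confidence-weighted voting over the candidate directions. This is only an illustrative fusion scheme; the document does not prescribe a specific one:

```python
def fuse_direction_predictions(cues):
    """Combine independent (direction, confidence) cues into a single
    predicted direction plus its share of the total confidence mass."""
    scores = {}
    for direction, confidence in cues:
        scores[direction] = scores.get(direction, 0.0) + confidence
    if not scores:
        return None, 0.0
    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())
```

For example, a right-turn lane plus a right direction indicator will outvote a navigator route that is slightly out of date, and the returned share can serve as the overall confidence of the prediction.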
Further, the vehicle 100 comprises an Advanced Driver-Assistance System (ADAS) 150.
The advanced driver-assistance system 150 is a system to help the driver in the driving process. When designed with a safe human-machine interface, such as e.g. a display, vehicle safety is increased.
Many road accidents are unfortunately consequences of driver mistakes, like for example not observing another approaching vehicle or obstacle in time. The advanced driver-assistance system 150 is developed to automate, adapt and enhance the vehicle 100 for safety and better driving. The driver is alerted in order to avoid potential problems; alternatively and/or additionally, the advanced driver-assistance system 150 may take control over the vehicle 100 in order to avoid a predicted accident. Some examples of adaptive features of the advanced driver-assistance system 150, besides the examples already mentioned, are adaptive cruise control, automated braking before a detected obstacle, incorporated navigator/traffic warnings, lane departure warnings, automatic lane centering, or showing what is in blind spots. However, constantly outputting this information to the driver may cause a severe cognitive workload on the driver and possibly make him/her immune to alerts from the advanced driver-assistance system 150, due to adaptation to the stream of mainly irrelevant alerts.
A particularly dangerous situation in urban areas is right turns (or left turns in left-hand traffic), as obstacles 160, e.g. vulnerable road users such as bicyclists or pedestrians, may cross the path of the vehicle 100. In particular, heavy vehicles 100 such as trucks, buses, etc. have limited visibility along the right side of the vehicle 100. Also, the driver must monitor the environmental traffic situation while turning to avoid accidents, making it difficult for the driver to keep the eyes in the most relevant direction all the time. The advanced driver-assistance system 150 is implemented to detect obstacles 160 and present the detected obstacle 160 on a human-machine interface, such as a display.
According to some embodiments, based on information from the traffic light 120 about the next phase shift to green, an in-vehicle safety system, such as a side camera, may be activated a configurable number of seconds before the shift, thereby giving the driver information on potential hazards before making the turn. Also, or alternatively, a human-machine interface may be activated and output information to the driver only when the information is relevant for the traffic safety, depending on the traffic situation.
Figure 2 illustrates a scenario wherein the vehicle 100 is approaching an intersection while driving in the driving direction 105, where different lanes 110a, 110b, 110c are dedicated for different directions after the intersection. In this particular illustrated intersection, one lane 110a is dedicated for right turn, one lane 110b is dedicated for driving straight ahead, while one driving lane 110c is dedicated for turning left.
The driving lanes 110a, 110b, 110c may be divided by road markings 250a, 250b. One first road marking 250a is situated between two of the lanes 110a, 110b, while one second road marking 250b is situated between the two other lanes 110b, 110c.
In this intersection, one traffic light 120 may be dedicated for the driving lane 110a for right turn, one other traffic light 220 may be dedicated for the driving lane 110b for driving straight ahead after the intersection, and one traffic light 240 may be dedicated for the driving lane 110c for turning left. These traffic lights 120, 220, 240 may be uncorrelated with each other in some embodiments, i.e. they may switch independently between different phases, such as typically red, amber/yellow and/or green. In other embodiments, at least some of the traffic lights 120, 220, 240 may be correlated with each other. Also, in some embodiments, one of the driving lanes 110a, 110b, 110c may be dedicated for a particular type of traffic or vehicle type, such as buses, taxis, emergency vehicles, etc. The traffic light 120, 220, 240 controlling the traffic of that driving lane 110a, 110b, 110c may have a different design, to make it easily distinguishable from those for normal/private traffic. The traffic light 120, 220, 240 may comprise letters, text, arrows or bars of white or coloured light.
Further, the traffic lights 120, 220, 240 may comprise or be associated with one or several transmitters/transceivers 130, 210, 230. Thus, several traffic lights 120, 220, 240 may be associated with one, or several, transmitters/transceivers 130, 210, 230 for communication with the close-by vehicles 100.
In some embodiments, the traffic lights 120, 220, 240 may be connected to a central server via a wired or wireless connection. Thereby, information may be coordinated and / or communicated with the central server.
Information concerning the remaining time in the green period of the traffic lights 120, 220, 240, or alternatively the remaining time in the red period before the traffic light 120, 220, 240 switches into green, may be communicated to the close-by vehicles 100.
However, it is important to determine which traffic light 120, 220, 240 is relevant for the vehicle 100.
In some embodiments, the geographical position of the vehicle 100 may be determined and, by using a detailed map of the road, it may be determined which driving lane 110a, 110b, 110c the vehicle 100 is situated in. It may be presumed that the vehicle 100 is going to drive in the direction associated with the driving lane 110a, 110b, 110c in which the vehicle 100 currently is situated. The confidence of the predicted driving direction may increase the shorter the distance is to the traffic light 120, 220, 240.
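Determining the current driving lane from the geographical position and a detailed map can be sketched as a nearest-centre match. The lane geometry, coordinate convention and lane width below are hypothetical simplifications of real map matching, which would work in full map coordinates with a position error model:

```python
def match_lane(lateral_position_m, lane_centers, lane_width_m=3.5):
    """Pick the lane whose centre line is closest to the vehicle's
    lateral position, and report whether the vehicle actually lies
    within that lane's boundaries (half a lane width off centre)."""
    lane = min(lane_centers,
               key=lambda name: abs(lateral_position_m - lane_centers[name]))
    offset = abs(lateral_position_m - lane_centers[lane])
    return lane, offset <= lane_width_m / 2

# Three lane centre lines as in Figure 2: right turn, straight, left turn
lanes = {"110a": -3.5, "110b": 0.0, "110c": 3.5}
```

For instance, a lateral position of -3.2 m matches lane 110a and therefore suggests a right turn, under the assumed geometry.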
In some embodiments, or in combination with the determination of the geographical position, a sensor 145 of the vehicle 100, such as e.g. a camera, may detect a sign on the driving lane 110a, 110b, 110c, typically an arrow, which information may be used for predicting the future driving direction of the vehicle 100. Also, or alternatively, the sensor 145 may detect driving direction indications from road signs, or from an ahead vehicle for example.
Further, the direction indicator of the vehicle 100 may be used to determine the intention of the driver of the vehicle 100.
Figure 3 illustrates an example of how the previously described scenario in Figure 1, and/or possibly Figure 2, may be perceived by a driver (if any) of the vehicle 100 approaching an intersection and a traffic light 120.
The traffic light 120 may be detected by the sensor 145 in the vehicle 100 in some embodiments. Further, or alternatively, the traffic light 120 may be detected by receiving wireless signals emitted by a transceiver 130 associated with the traffic light 120, received by the receiver 140 in the vehicle 100.
Further, the vehicle 100 comprises a control arrangement 300, configured for assisting the driver of the vehicle 100 in obtaining information of the advanced driver-assistance system 150 when it is considered relevant for the traffic safety. When it has been determined that the vehicle 100 is likely to turn right, it may be dangerous in case an obstacle 160 is situated in a blind spot on the right side of the vehicle 100. The information or alert may be outputted on a human-machine interface 310. The human-machine interface 310 may comprise a display, a loudspeaker, a tactile device, etc. In the illustrated example in Figure 3, a bicycle 160 is detected by a sensor of the advanced driver-assistance system 150 and an image thereof is outputted on the human-machine interface 310, possibly in combination with a sound alert, a haptic signal or tactile feedback in the steering wheel, driver seat or similar.
The geographical position of the vehicle 100 may be determined by a positioning unit 320 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The determination of the geographical position of the positioning unit 320 (and thereby also of the vehicle 100) may be made continuously with a certain predetermined or configurable time interval according to various embodiments.
Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 330a, 330b, 330c, 330d. In this example, four satellites 330a, 330b, 330c, 330d are depicted, but this is merely an example. More than four satellites 330a, 330b, 330c, 330d may be used for enhancing the precision, or for creating redundancy. The satellites 330a, 330b, 330c, 330d continuously transmit information about time and date (for example, in coded form), identity (which satellite 330a, 330b, 330c, 330d that broadcasts), status, and where the satellite 330a, 330b, 330c, 330d is situated at any given time. The GPS satellites 330a, 330b, 330c, 330d send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 330a, 330b, 330c, 330d to be distinguished from the others' information, based on a unique code for each respective satellite 330a, 330b, 330c, 330d. This information can then be transmitted to be received by the appropriately adapted positioning device comprised in the vehicle 100.
Distance measurement can according to some embodiments comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 330a, 330b, 330c, 330d to reach the positioning unit 320. As the radio signals travel at the speed of light, the distance to the respective satellite 330a, 330b, 330c, 330d may be computed by measuring the signal propagation time.
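The distance computation from signal propagation time can be illustrated as follows. This is a simplified sketch: a real receiver must additionally solve for its own clock bias as a fourth unknown, which is one reason a fourth satellite is used in practice:

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # radio signals travel at the speed of light

def distance_from_propagation(t_transmit_s, t_receive_s):
    """Distance to a satellite implied by the signal travel time
    (receiver and satellite clock errors are ignored here)."""
    return (t_receive_s - t_transmit_s) * SPEED_OF_LIGHT_M_S
```

A travel time of about 0.07 s thus corresponds to roughly 21,000 km, which is the order of magnitude of a GNSS satellite's distance from the earth's surface.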
The positions of the satellites 330a, 330b, 330c, 330d are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 330a, 330b, 330c, 330d through triangulation. For determination of altitude, signals from four satellites 330a, 330b, 330c, 330d may be used according to some embodiments.
Having determined the geographical position by the positioning unit 320 (or in another way), the relevant driving lane 110a, 110b, 110c of the vehicle 100 may be determined by applying the determined geographical location of the vehicle 100 to a detailed map of the area, which may be comprised in a database 340 in the vehicle 100, or possibly outside the vehicle 100.
However, the position of the vehicle 100 may alternatively, or additionally be determined e.g. by having transponders positioned at known positions around the route and a dedicated sensor in the vehicle 100, for recognising the transponders and thereby determining the position; by detecting and recognising WiFi networks (WiFi networks along the route may be mapped with certain respective geographical positions in a database); by receiving a Bluetooth beaconing signal, associated with a geographical position, or other signal signatures of wireless signals such as e.g. by triangulation of signals emitted by a plurality of fixed base stations with known geographical positions.
The vehicle 100 may further comprise a navigator, in which the driver or other person may register the destination of the vehicle 100. In some embodiments, it may be presumed that the vehicle 100 is going to turn in the direction leading towards the destination of the navigator when arriving at the intersection. Further, or alternatively, this information may be used in combination with other indications, e.g. driving lane 110a, 110b, 110c determination, for making a better prediction.
Thereby, it may be detected that it is likely that the vehicle 100 is going to turn right, whereupon either the human-machine interface 310, or alternatively the whole advanced driver-assistance system 150, may be activated. It is thereby ascertained that the driver only is provided with relevant/critical information from the advanced driver-assistance system 150, thereby enhancing the driver's trust in the system 150.
Figure 4 illustrates an intersection 400 as regarded from above. The vehicle 100 is driving in a driving direction 105 on a driving lane 110 and is approaching the intersection 400 and the traffic light 120, regulating the traffic in at least one direction after the intersection. The intersection 400 may for example be a road junction, a roundabout, or similar.
In the illustrated scenario, the vehicle 100 is going to turn right after the intersection. Right after the intersection, a pedestrian may cross the road on a pedestrian crossing. However, the pedestrian crossing may be entirely or partly concealed, e.g. by a parked vehicle, etc.
In some embodiments, a vehicle external sensor 410 may be arranged to survey the pedestrian crossing and send information via a wireless transceiver 420. The information may be obtained by the vehicle 100 via the receiver 140 and outputted to the driver of the vehicle 100 on the human-machine interface 310.
When no pedestrian is detected on or close by the pedestrian crossing, no information may be outputted in some embodiments.
An advantage with using vehicle external sensors 410 for capturing relevant information, providing it to the vehicle 100 and outputting it, when considered relevant, to the driver of the vehicle 100, is that relevant information may be outputted to the driver also when the situation is not possible to observe from the vehicle 100, or to detect by any sensor 145 on the vehicle 100, etc.
The vehicle external sensors 410 may thereby assist the driver in observing a relevant traffic related object on, or in vicinity of, a road 110, e.g. in a reduced visibility situation, in twilight, at night, in rain, in fog, in snow or similar weather conditions, and/or when an obstacle is blocking the sight of the driver or of the sensor 145 on the vehicle 100. Thereby traffic safety is enhanced.
Figure 5 illustrates an example of a method 500 according to an embodiment. The flow chart in Figure 5 shows the method 500 for use in a vehicle 100 for controlling an advanced driver-assistance system 150 of the vehicle 100, which advanced driver-assistance system 150 is configured to monitor a vehicle environment in a future driving direction of the vehicle 100.
In order to correctly be able to control the advanced driver-assistance system 150, the method 500 may comprise a number of steps 501-509. However, some of these steps 501-509 may be performed solely in some alternative embodiments, like e.g. steps 506-509. Further, the described steps 501-509 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps: Step 501 comprises determining that the vehicle 100 is approaching an intersection 400, regulated by at least one traffic light 120, 220, 240.
It may be determined that the vehicle 100 is approaching the intersection 400 based on input from a sensor 145 of the vehicle 100. The sensor 145 may comprise e.g. a front camera in the vehicle 100, which together with image interpreting logic in the control arrangement 300 may detect the traffic light 120, 220, 240 and/ or other objects.
In some embodiments, it may be determined that the vehicle 100 is approaching the intersection 400 based on input from a positioning unit 320, configured to determine position of the positioning unit 320 based on signals received from satellites 330a, 330b, 330c, 330d of a satellite navigation system.
Furthermore, these embodiments may be combined in some embodiments. It may then be determined that the vehicle 100 is approaching the intersection 400 based on input from both one or several sensors 145 in the vehicle 100, and input from the positioning unit 320.
Step 502 comprises predicting, before the vehicle 100 is passing the intersection 400, a future driving direction of the vehicle 100 after the intersection 400.
The future driving direction of the vehicle 100 after the intersection 400 may be predicted based on input from the sensor 145 of the vehicle 100.
In some embodiments, the future driving direction of the vehicle 100 after the intersection 400 may be predicted based on input from the positioning unit 320, configured to determine position of the positioning unit 320 based on signals received from satellites 330a, 330b, 330c, 330d of the satellite navigation system.
Furthermore, these embodiments may be combined in some embodiments. The future driving direction of the vehicle 100 after the intersection 400 may then be predicted based on input from both one or several sensors 145 in the vehicle 100, and input from the positioning unit 320. Thereby, a more reliable prediction may be made in some embodiments.
The prediction 502 of the future driving direction of the vehicle 100 after the intersection 400 may be considered more confident the closer the vehicle 100 is to the traffic light 120.
The prediction 502 of the future driving direction of the vehicle 100 after the intersection 400 may be considered confident when the line 250a, 250b between the lanes 110a, 110b, 110c indicates that lane change is prohibited.
The prediction 502 of the future driving direction of the vehicle 100 after the intersection 400 may be based on obtained information concerning restricted use of the current driving lane, e.g. due to an obstacle, road work or accident. The prediction 502 may also be considered less confident when the obtained information indicates that the current driving lane is disallowed for the vehicle 100, e.g. when the driving lane is dedicated for busses only.
Step 503 comprises identifying the traffic light 120 regulating the traffic in the predicted 502 future driving direction of the vehicle 100.
The identification of the traffic light 120 may be made in one out of several possible ways, such as for example by identifying the traffic light 120 with a forwardly directed sensor 145 on the vehicle 100, by extracting an identification of the traffic light 120 from a database, by receiving an identity signal from the traffic light 120, etc.
Step 504 comprises receiving information from a transmitter 130, associated with the identified 503 traffic light 120.
The transmitter 130 may be integrated with the traffic light 120 in some embodiments. However, the transmitter 130 may be external to the traffic light 120 and communicating therewith over a wired or wireless communication interface.
Step 505 comprises activating at least one human machine interface 310, e.g. an auditory signal, a voice message, a tactile signal, a message on the display etc., of the advanced driver-assistance system 150 of the vehicle 100, based on the received 504 information.
The human machine interface 310 may be activated a time period before the traffic light 120 is going to make a phase shift between green/ yellow/ red in some embodiments.
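This timing rule — activating the interface a time period before the phase shift — can be sketched with a hypothetical configurable lead time (the 5 s default below is an assumed, not disclosed, value):

```python
def should_activate_hmi(seconds_to_phase_shift, lead_time_s=5.0):
    """Activate the human machine interface once the traffic light's
    next phase shift (as reported by the transmitter) is within
    `lead_time_s` seconds; negative times mean the shift has passed."""
    return 0.0 <= seconds_to_phase_shift <= lead_time_s
```

With this rule the interface stays silent while the phase shift is far away and only activates inside the configured window, matching the aim of avoiding a stream of irrelevant alerts.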
The driver's attention is directed towards the potentially dangerous traffic situation. Thereby, a possible accident may be avoided, while the driver is not overwhelmed with redundant information.
In some embodiments, escalated warnings may be output when the driver of the vehicle 100 does not react to the outputted information concerning the obstacle 160, e.g. by adding vibrations and/or acoustic signals for alerting the driver, and/or by automatically braking the vehicle 100, thereby enhancing traffic safety.
A typical example may be to activate output of a sensor configured for detecting unprotected road users on the right side of the vehicle 100 when the vehicle 100 is predicted to turn right, and vice versa.
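Selecting which side sensors to activate from the predicted direction could look like the following sketch; the sensor names are purely illustrative and not part of the disclosure:

```python
def sensors_for_direction(predicted_direction):
    """Map the predicted driving direction to the blind-spot sensors
    worth activating, e.g. right-side sensors before a right turn
    ("and vice versa" for a left turn in the text above)."""
    mapping = {
        "right": ["right_side_camera", "right_blind_spot_radar"],
        "left": ["left_side_camera", "left_blind_spot_radar"],
    }
    return mapping.get(predicted_direction, [])
```

A vehicle predicted to continue straight ahead would activate no extra side sensors under this rule, keeping the driver's information load low.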
Step 506 may be performed in some embodiments of the method 500 wherein the activation 505 of the advanced driver-assistance system 150 also comprises activation 505 of a vehicle external sensor 410. Step 506 comprises receiving 506 information from a transmitter 420, associated with the vehicle external sensor 410, via a wireless communication interface.
Thereby information concerning e.g. an obstacle 160 in the predicted 502 driving way of the vehicle 100 may be detected and output to the driver, and an accident may be avoided.
Step 507 may be performed in some embodiments of the method 500. Step 507 comprises outputting information of the activated 505 advanced driver-assistance system 150 to the driver of the vehicle 100.
Information may be outputted to the driver of the vehicle 100, only when an obstacle 160 is detected by the activated 505 advanced driver-assistance system 150 in the predicted 502 future driving direction of the vehicle 100.
The outputted 507 information may comprise information detected by the vehicle external sensor 410.
Step 508 may be performed only in some embodiments wherein the information is received 504 from the transmitter 130 when the vehicle 100 is going to change driving directions. Step 508 comprises determining that the vehicle 100 has completed the driving direction change.
Step 509, which may be performed only in some embodiments wherein step 508 has been performed, comprises deactivating the advanced driver-assistance system 150 of the vehicle 100, or a human machine interface 310 thereof, when the driving direction change is determined 508 to be completed.
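Steps 505, 508 and 509 together form a small activate/deactivate cycle, which can be sketched as a state machine; the class and method names here are assumptions for illustration only:

```python
class AdasGate:
    """Tracks whether the driver-assistance HMI is active: activated
    when received traffic-light information indicates a predicted
    direction change (step 505), deactivated once the direction
    change is determined to be completed (steps 508-509)."""

    def __init__(self):
        self.active = False

    def on_traffic_light_info(self, direction_change_predicted):
        # Step 505: activate only when a direction change is predicted,
        # so irrelevant traffic lights do not trigger any output
        if direction_change_predicted:
            self.active = True
        return self.active

    def on_turn_completed(self):
        # Steps 508-509: the turn is done, stop outputting information
        self.active = False
```

This mirrors the stated goal: the interface is only active during the window where its information is relevant, and falls silent again once the turn is completed.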
It is thereby avoided that the driver is distracted by irrelevant or redundant information and may instead focus on the current driving situation, enhancing traffic safety.
Figure 6 illustrates an embodiment of a system 600 for driver assistance. The system 600 is configured to perform at least some of the method steps 501-509, of the previously described method 500.
The system 600 comprises a control arrangement 300 for controlling an advanced driver-assistance system 150 of the vehicle 100. The control arrangement 300 is configured to control an advanced driver-assistance system 150 of the vehicle 100, which advanced driver-assistance system 150 is configured to monitor a vehicle environment in a future driving direction of the vehicle 100. The control arrangement 300 is arranged in the vehicle 100. The control arrangement 300 is configured to determine that the vehicle 100 is approaching an intersection 400, regulated by at least one traffic light 120, 220, 240. Further, the control arrangement 300 is configured to predict, before the vehicle 100 is passing the intersection 400, a future driving direction of the vehicle 100 after the intersection 400. Also, the control arrangement 300 is configured to identify the traffic light 120 regulating the traffic in the predicted future driving direction. The control arrangement 300 is furthermore configured to receive information from a transmitter 130, associated with the identified traffic light 120. In further addition, the control arrangement 300 is configured to activate at least a human machine interface 310 of the advanced driver-assistance system 150 of the vehicle 100, based on the received information.
In some embodiments, the control arrangement 300 is configured to, via the human machine interface 310, output information of the activated advanced driver-assistance system 150 to the driver of the vehicle 100.
In yet some embodiments, the control arrangement 300 is configured to output information to the driver of the vehicle 100, only when an obstacle 160 is detected by the activated advanced driver-assistance system 150, in the predicted future driving direction of the vehicle 100.
Also, in some embodiments, the control arrangement 300 may further be configured to receive the information from the transmitter 130 when the vehicle 100 is going to change driving directions, via the receiver 140. The control arrangement 300 may also be configured to determine that the vehicle 100 has completed the driving direction change. Furthermore, the control arrangement 300 is configured to deactivate the advanced driver-assistance system 150 of the vehicle 100 when the driving direction change is determined to be completed.
In yet some embodiments, the control arrangement 300 may alternatively be configured to activate a vehicle external sensor 410. The control arrangement 300 may further be configured to receive, via the receiver 140, information from a transmitter 420 associated with the vehicle external sensor 410. Further, the control arrangement 300 may be configured to output, via the output device 310, the information detected by the vehicle external sensor 410.
In yet some embodiments, the control arrangement 300 may be configured to activate the advanced driver-assistance system 150 at a time period before the traffic light 120 is changing states between green/ red.
The control arrangement 300 may be configured to determine that the vehicle 100 is approaching the intersection 400 based on input from a sensor 145 of the vehicle 100. Also, the control arrangement 300 may be configured to predict, before the vehicle 100 is passing the intersection 400, a future/ predicted driving direction of the vehicle 100 after the intersection 400 based on input from the sensor 145.
The control arrangement 300 may in some embodiments be configured to determine that the vehicle 100 is approaching the intersection 400, and predict, before the vehicle 100 is passing the intersection 400, a future driving direction of the vehicle 100 after the intersection 400, based on input from a positioning unit 320, configured to determine position of the positioning unit 320 based on signals received from satellites 330a, 330b, 330c, 330d of a satellite navigation system.
The system 600 may also comprise a traffic light 120.
In addition, the system 600 further may comprise a transmitter 130, associated with the traffic light 120.
Furthermore, the system 600 in addition may comprise a receiver 140, configured to receive information from the transmitter 130.
The control arrangement 300 may comprise a receiving circuit 610 configured for receiving wireless and/or wired signals from the receiver 140 and/or the sensor 145. The control arrangement 300 may also comprise a processing circuitry 620 configured for performing at least some of the calculating or computing of the control arrangement 300. Thus, the processing circuitry 620 may be configured for obtaining information from a traffic light 120, controlling traffic at an intersection 400.
Such processing circuitry 620 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, a processor, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression “processing circuitry” may thus represent a processing device comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control arrangement 300 may comprise a memory 625 in some embodiments. The optional memory 625 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 625 may comprise integrated circuits comprising silicon-based transistors. The memory 625 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control arrangement 300 may comprise a signal transmitter 630. The signal transmitter 630 may be configured for transmitting a control signal over a wired or wireless interface to a display or Human Machine Interface 310, which in turn may output information to the driver of the vehicle 100.
However, in some alternative embodiments, the system 600 may comprise additional units for performing the method 500 according to method steps 501-509.
Further, some embodiments may comprise a vehicle 100, comprising at least a part of the system 600, such as the control arrangement 300 and the receiver 140. In some embodiments, the vehicle 100 may comprise a sensor 145.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 501-505 according to some embodiments when being loaded into the one or more processing circuits 620 of the control arrangement 300. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control arrangement 300 remotely, e.g., over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 500, the control arrangement 300, the computer program, the system 600 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via Internet or other wired or wireless communication system.

Claims (18)

PATENT CLAIMS
1. A method (500) for controlling an advanced driver-assistance system (150) of a vehicle (100), which advanced driver-assistance system (150) is configured to monitor a vehicle environment in a future driving direction of the vehicle (100), wherein the method (500) comprises the steps of: determining (501) that the vehicle (100) is approaching an intersection (400), regulated by at least one traffic light (120, 220, 240); predicting (502), before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); identifying (503) the traffic light (120) regulating the traffic in the predicted (502) future driving direction; receiving (504) information from a transmitter (130), associated with the identified (503) traffic light (120); and activating (505) at least a human machine interface (310) of the advanced driverassistance system (150) of the vehicle (100), based on the received (504) information.
2. The method (500) according to claim 1, further comprising the step of: outputting (507) information of the activated (505) advanced driver-assistance system (150) to the driver of the vehicle (100).
3. The method (500) according to claim 2, wherein: information is outputted (507) to the driver of the vehicle (100), only when an obstacle (160) is detected by the activated (505) advanced driver-assistance system (150) in the predicted (502) future driving direction of the vehicle (100).
4. The method (500) according to any one of the preceding claims, wherein the information is received (504) from the transmitter (130) when the vehicle (100) is going to change driving directions, and wherein the method (500) further comprises the steps of: determining (508) that the vehicle (100) has completed the driving direction change; deactivating (509) the advanced driver-assistance system (150) of the vehicle (100) when the driving direction change is determined (508) to be completed.
5. The method (500) according to any one of the preceding claims, wherein the activation (505) of the advanced driver-assistance system (150) also comprises activation (505) of a vehicle external sensor (410); and wherein the method (500) further comprises the step of: receiving (506) information from a transmitter (420), associated with the vehicle external sensor (410); and wherein the outputted (507) information comprises information detected by the vehicle external sensor (410).
6. The method (500) according to any one of the preceding claims, wherein the activation (505) of the advanced driver-assistance system (150) is made at a time period before the traffic light (120) is changing states between green/red.
7. The method (500) according to any one of the preceding claims, wherein the steps of: determining (501) that the vehicle (100) is approaching the intersection (400), regulated by at least one traffic light (120, 220, 240); and predicting (502), before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); are based on input from a sensor (145) of the vehicle (100).
8. The method (500) according to any one of the preceding claims, wherein the steps of: determining (501) that the vehicle (100) is approaching the intersection (400), regulated by at least one traffic light (120, 220, 240); and predicting (502), before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); are based on input from a positioning unit (320), configured to determine position of the positioning unit (320) based on signals received from satellites (330a, 330b, 330c, 330d) of a satellite navigation system.
9. A control arrangement (300) for controlling an advanced driver-assistance system (150) of a vehicle (100), which advanced driver-assistance system (150) is configured to monitor a vehicle environment in a future driving direction of the vehicle (100), wherein the control arrangement (300) is configured to: determine that the vehicle (100) is approaching an intersection (400), regulated by at least one traffic light (120, 220, 240); predict, before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); identify the traffic light (120) regulating the traffic in the predicted future driving direction; receive information from a transmitter (130), associated with the identified traffic light (120); and activate at least a human machine interface (310) of the advanced driver-assistance system (150) of the vehicle (100), based on the received information.
10. The control arrangement (300) according to claim 9, configured to, via the human machine interface (310), output information of the activated advanced driver-assistance system (150) to the driver of the vehicle (100).
11. The control arrangement (300) according to claim 10, configured to output information to the driver of the vehicle (100), only when an obstacle (160) is detected by the activated advanced driver-assistance system (150), in the predicted future driving direction of the vehicle (100).
12. The control arrangement (300) according to any one of claims 9-11, configured to receive the information from the transmitter (130) when the vehicle (100) is going to change driving directions, via the receiver (140), and wherein the control arrangement (300) is configured to: determine that the vehicle (100) has completed the driving direction change; and deactivate the advanced driver-assistance system (150) of the vehicle (100) when the driving direction change is determined to be completed.
13. The control arrangement (300) according to any one of claims 9-12, configured to: activate a vehicle external sensor (410); and receive, via the receiver (140), information from a transmitter (420) associated with the vehicle external sensor (410); and output, via the output device (310) the information detected by the vehicle external sensor (410).
14. The control arrangement (300) according to any one of claims 9-13, configured to activate the advanced driver-assistance system (150) at a time period before the traffic light (120) is changing states between green/red.
15. The control arrangement (300) according to any one of claims 9-14, configured to: determine that the vehicle (100) is approaching the intersection (400); and predict, before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); based on input from a sensor (145) of the vehicle (100).
16. The control arrangement (300) according to any one of claims 9-15, configured to: determine that the vehicle (100) is approaching the intersection (400); and predict, before the vehicle (100) is passing the intersection (400), a future driving direction of the vehicle (100) after the intersection (400); based on input from a positioning unit (320), configured to determine position of the positioning unit (320) based on signals received from satellites (330a, 330b, 330c, 330d) of a satellite navigation system.
17. A computer program comprising program code for performing a method (500) according to any one of claims 1-8 when the computer program is executed in a control arrangement (300), according to any one of claims 9-16.
18. A system (600) for driver assistance, comprising: an advanced driver-assistance system (150) configured to monitor a vehicle environment of the vehicle (100); a control arrangement (300) according to any one of claims 9-16; a traffic light (120); a transmitter (130), associated with the traffic light (120); and a receiver (140), configured to receive information from the transmitter (130).
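The method steps of claim 1 (determining 501, predicting 502, identifying 503, receiving 504, activating 505) can be sketched as a minimal decision function. This is purely illustrative and not part of the patent text: all names here (`TrafficLightMessage`, `select_relevant_message`, `should_activate_adas`, the message fields) are hypothetical, and the real system would use vehicle sensors, a positioning unit, and a V2X receiver rather than in-memory lists.

```python
# Hedged sketch of the claimed control logic; all identifiers are hypothetical.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TrafficLightMessage:
    """A message received (step 504) from a transmitter at a traffic light."""
    light_id: str    # identifier of the transmitting traffic light
    direction: str   # driving direction this light regulates, e.g. "left"
    state: str       # current state, e.g. "green" or "red"


def select_relevant_message(predicted_direction: str,
                            messages: List[TrafficLightMessage]
                            ) -> Optional[TrafficLightMessage]:
    """Identify (step 503) the traffic light regulating the traffic
    in the predicted future driving direction."""
    for msg in messages:
        if msg.direction == predicted_direction:
            return msg
    return None


def should_activate_adas(approaching_intersection: bool,
                         predicted_direction: str,
                         messages: List[TrafficLightMessage]) -> bool:
    """Condensed steps 501-505: activate at least the HMI of the ADAS only
    when the vehicle approaches a regulated intersection AND information has
    been received from the light governing the predicted direction."""
    if not approaching_intersection:
        return False
    return select_relevant_message(predicted_direction, messages) is not None


msgs = [TrafficLightMessage("TL-120", "left", "green"),
        TrafficLightMessage("TL-220", "straight", "red")]
print(should_activate_adas(True, "left", msgs))   # activation condition met
```

Gating the activation on both the intersection approach and the direction-matched message mirrors the claim's intent that the ADAS is only engaged for the traffic light that is actually relevant to the vehicle's predicted path.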
SE1850843A 2018-07-04 2018-07-04 Method and control arrangement for controlling an adas SE542785C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1850843A SE542785C2 (en) 2018-07-04 2018-07-04 Method and control arrangement for controlling an adas
DE102019004481.9A DE102019004481A1 (en) 2018-07-04 2019-06-26 Method and arrangement for controlling a sophisticated driver assistance system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1850843A SE542785C2 (en) 2018-07-04 2018-07-04 Method and control arrangement for controlling an adas

Publications (2)

Publication Number Publication Date
SE1850843A1 SE1850843A1 (en) 2020-01-05
SE542785C2 true SE542785C2 (en) 2020-07-07

Family

ID=68943993

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1850843A SE542785C2 (en) 2018-07-04 2018-07-04 Method and control arrangement for controlling an adas

Country Status (2)

Country Link
DE (1) DE102019004481A1 (en)
SE (1) SE542785C2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3109125B1 (en) * 2020-04-14 2022-06-24 Renault Sas Method and device for assisting the driving of a motor vehicle in an intersection
DE102021104120A1 (en) * 2021-02-22 2022-08-25 Bayerische Motoren Werke Aktiengesellschaft Vehicle guidance system and method for operating a driving function at a traffic junction

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10507807B2 (en) 2015-04-28 2019-12-17 Mobileye Vision Technologies Ltd. Systems and methods for causing a vehicle response based on traffic light detection
DE102015210146A1 2015-06-02 2016-12-08 Robert Bosch Gmbh Method and device for avoiding a collision of a motor vehicle with following motor vehicles at a traffic signal system
KR102384175B1 (en) 2015-07-29 2022-04-08 주식회사 만도모빌리티솔루션즈 Camera device for vehicle
DE102016012054A1 (en) 2016-10-07 2017-06-01 Daimler Ag Method for operating a pedestrian protection system of a vehicle

Also Published As

Publication number Publication date
DE102019004481A1 (en) 2020-01-09
SE1850843A1 (en) 2020-01-05

Similar Documents

Publication Publication Date Title
CN110036423B (en) Method and control unit for adjusting the inter-vehicle distance between vehicles in a vehicle train
US10282999B2 (en) Road construction detection systems and methods
US10800455B2 (en) Vehicle turn signal detection
US9983591B2 (en) Autonomous driving at intersections based on perception data
US10210406B2 (en) System and method of simultaneously generating a multiple lane map and localizing a vehicle in the generated map
US11315424B2 (en) Automotive driver assistance
KR102089706B1 (en) Vehicle method and control unit for estimating stretch of road based on a set of marks of other vehicles
US11518394B2 (en) Automotive driver assistance
US20150153184A1 (en) System and method for dynamically focusing vehicle sensors
US11414073B2 (en) Automotive driver assistance
US10967972B2 (en) Vehicular alert system
US20180072220A1 (en) Collision Avoidance System for Vehicles
US11608606B2 (en) Method and control unit for ground bearing capacity analysis
US11820387B2 (en) Detecting driving behavior of vehicles
US10909848B2 (en) Driving assistance device
JP7362733B2 (en) Automated crowdsourcing of road environment information
US11600076B2 (en) Detection of a hazardous situation in road traffic
SE542785C2 (en) Method and control arrangement for controlling an adas
US20200219399A1 (en) Lane level positioning based on neural networks
WO2020046187A1 (en) Method and control arrangement for calculating an appropriate vehicle speed
WO2022177495A1 (en) Method and control arrangement for estimating relevance of location-based information of another vehicle