WO2017030494A1 - Method, control unit and system for detecting and tracking vulnerable road users

Method, control unit and system for detecting and tracking vulnerable road users

Info

Publication number
WO2017030494A1
WO2017030494A1 (PCT/SE2016/050762)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
vru
sensor
camera
detecting
Prior art date
Application number
PCT/SE2016/050762
Other languages
English (en)
Inventor
Jonny Andersson
Marie BEMLER
Joseph Ah-King
Christian Larsson
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to DE112016003241.2T (DE112016003241T5)
Publication of WO2017030494A1

Classifications

    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes (traffic control systems for road vehicles)
    • B60R21/34 Protecting non-occupants of a vehicle, e.g. pedestrians
    • B60W40/04 Estimation of non-directly measurable driving parameters related to ambient traffic conditions
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/93 Radar or analogous systems specially adapted for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S2013/9316 Anti-collision radar for land vehicles combined with communication equipment with other vehicles or with base stations
    • G01S2013/9322 Anti-collision radar for land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S2013/9323 Anti-collision radar for land vehicles with alternative operation using light waves
    • G01S2013/9324 Anti-collision radar for land vehicles with alternative operation using ultrasonic waves
    • G01S15/86 Combinations of sonar systems with lidar systems or with systems not using wave reflection
    • G01S15/931 Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads (image context exterior to a vehicle, using sensors mounted on the vehicle)
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands

Definitions

  • This document relates to a method, a control unit and a system in a vehicle. More particularly, a method, a control unit and a system are described for detecting and tracking a Vulnerable Road User (VRU).
  • VRU Vulnerable Road User
  • Non-motorised road users such as e.g. pedestrians and cyclists as well as motorcyclists and persons with disabilities and/or reduced mobility and orientation are sometimes referred to as Vulnerable Road Users (VRUs).
  • VRUs Vulnerable Road Users
  • a particularly dangerous scenario is when VRUs are situated in the vehicle driver's blind spot when the vehicle is turning at low speeds.
  • RADAR and LIDAR sensors generally cannot discriminate between road users and stationary objects unless the detected object has been seen to be moving. Consequently, a stationary pedestrian waiting at a pedestrian crossing can generally not be separated from other stationary objects, such as a lamppost, in a RADAR- or LIDAR-based system.
  • this objective is achieved by a method in a vehicle for detecting and tracking a Vulnerable Road User (VRU).
  • the method comprises detecting an object by a camera of the vehicle; classifying the detected object as a VRU and making a movement prediction reliability estimation of the VRU, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability.
  • the method further comprises detecting the object by a sensor of the vehicle; mapping the classified VRU with the object detected by the sensor; and tracking the VRU by the sensor.
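  • By way of illustration only, the classification and movement prediction reliability estimation described above could be organised as in the following Python sketch; the class labels, the attended-animal flag and the 1.4 m threshold value are assumptions made for this example and are not values specified by the document.

```python
from dataclasses import dataclass
from enum import Enum

class Reliability(Enum):
    REDUCED = "reduced"
    NORMAL = "normal"
    ENHANCED = "enhanced"

@dataclass
class CameraDetection:
    label: str        # e.g. "pedestrian", "child", "cyclist", "animal" (illustrative labels)
    height_m: float   # estimated height of the detected object
    attended: bool    # for animals: True if accompanied by a person

VRU_LABELS = {"pedestrian", "child", "cyclist", "motorcyclist", "animal"}
HEIGHT_THRESHOLD_M = 1.4  # configurable threshold length (illustrative value)

def classify(det: CameraDetection):
    """Classify a camera detection as a VRU (or not) and estimate how
    reliably its future movement can be predicted."""
    if det.label not in VRU_LABELS:
        return None  # not a vulnerable road user
    # Unattended animals and people shorter than the configurable threshold
    # length are classified as having reduced movement prediction reliability.
    if (det.label == "animal" and not det.attended) or det.height_m < HEIGHT_THRESHOLD_M:
        return Reliability.REDUCED
    # A later embodiment mentions enhanced reliability for motorcycle drivers.
    if det.label == "motorcyclist":
        return Reliability.ENHANCED
    return Reliability.NORMAL

print(classify(CameraDetection("child", 1.1, True)))       # Reliability.REDUCED
print(classify(CameraDetection("pedestrian", 1.8, True)))  # Reliability.NORMAL
```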
  • this objective is achieved by a control unit in a vehicle.
  • the control unit aims at detecting and tracking a VRU.
  • the control unit is configured for detecting an object by a camera of the vehicle in accordance with the above.
  • this objective is achieved by a computer program comprising program code for performing a method according to the first aspect when the computer program is executed in a control unit according to the second aspect.
  • a system for detecting and tracking a VRU comprises a control unit according to the second aspect. Further the system also comprises a camera of the vehicle, for detecting an object. The system additionally comprises at least one sensor of the vehicle, for detecting the object.
  • Accurate VRU tracking is essential, e.g. for creating a reliable VRU warning system that warns/intervenes when a collision with a VRU is really probable, i.e. when the predicted path of the vehicle and the predicted path of the VRU overlap (one way to evaluate this criterion is sketched below).
  • Such a system will gain high acceptance and trust as superfluous warnings are eliminated or at least reduced, which in turn is expected to reduce fatalities in turning accidents.
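  • As a rough illustration of the overlapping-path criterion, the following Python sketch predicts both paths under a constant-velocity assumption and flags an overlap when they come closer than a gap threshold; the horizon, time step and gap values are illustrative assumptions, not values from the document.

```python
def paths_overlap(veh_pos, veh_vel, vru_pos, vru_vel,
                  horizon_s=3.0, step_s=0.1, min_gap_m=1.0):
    """Return True if constant-velocity predictions of the vehicle and the
    VRU come closer than min_gap_m within the prediction horizon."""
    steps = int(horizon_s / step_s)
    for i in range(steps + 1):
        t = i * step_s
        # Predicted positions at time t, assuming constant velocity.
        dx = (veh_pos[0] + veh_vel[0] * t) - (vru_pos[0] + vru_vel[0] * t)
        dy = (veh_pos[1] + veh_vel[1] * t) - (vru_pos[1] + vru_vel[1] * t)
        if (dx * dx + dy * dy) ** 0.5 < min_gap_m:
            return True
    return False

# Vehicle turning while a pedestrian crosses its predicted path -> overlap -> warn.
print(paths_overlap(veh_pos=(0, 0), veh_vel=(5, 2),
                    vru_pos=(12, 6), vru_vel=(-1, -1)))   # True
```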
  • increased traffic security is achieved.
  • the traffic security may be enhanced, as the driver's attention could be increased in situations where the VRU is particularly unpredictable.
  • superfluous warnings for adult, reliable VRUs may thereby be avoided.
  • the system gains high acceptance from the driver, leading to the driver becoming more attentive to alert signals.
  • Figure 1 illustrates a vehicle according to an embodiment of the invention
  • Figure 2A illustrates an example of a traffic scenario and an embodiment of the invention
  • Figure 2B illustrates an example of a traffic scenario and an embodiment of the invention
  • Figure 3 illustrates an example of a vehicle interior according to an embodiment
  • Figure 4 is a flow chart illustrating an embodiment of the method
  • Figure 5 is an illustration depicting a system according to an embodiment.
  • Embodiments of the invention described herein are defined as a method, a control unit and a system, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
  • Figure 1 illustrates a scenario with a vehicle 100.
  • the vehicle 100 is driving on a road in a driving direction 105.
  • the vehicle 100 may comprise e.g. a truck, a bus or a car, or any similar vehicle or other means of conveyance.
  • the herein described vehicle 100 may be a driver-controlled or a driverless, autonomously controlled vehicle 100 in some embodiments. However, for enhanced clarity, it is subsequently described as having a driver.
  • the vehicle 100 comprises a camera 110 and a sensor 120.
  • the camera 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.
  • An advantage of placing the camera 110 behind the windscreen is that the camera 110 is protected from dirt, snow, rain and, to some extent, also from damage, vandalism and/or theft.
  • the camera 110 may be directed towards the front of the vehicle 100, in the driving direction 105. Thereby, the camera 110 may detect a VRU in the driving direction 105 ahead of the vehicle 100.
  • the camera may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, or a time-of-flight camera in different embodiments.
  • Mounting the camera 110 behind the windshield has some advantages compared to externally mounted camera systems. These advantages include the possibility of using the windshield wipers for cleaning and using the light from the headlights to illuminate objects in the camera's field of view. Such a multi-function camera 110 can also be used for a variety of other tasks.
  • the sensor 120 may be situated at the side of the vehicle 100, arranged to detect objects at the side of the vehicle 100.
  • the sensor 120 may comprise e.g. a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar in different embodiments.
  • the sensor 120 may comprise e.g. a Passive Infrared (PIR) sensor in some embodiments.
  • PIR Passive Infrared
  • the advantages of the sensor 120 are the detection range, price, robustness and ability to operate in all weather conditions. Thereby high confidence detections and classifications may be achieved. Thanks to the combination of the camera 110, which may detect the VRU also when it is stationary, and the sensor 120, which may track any VRU detected by the camera 110, a high performance function of a VRU warning/intervention system is achieved, possibly without adding any side viewing camera to the vehicle 100. Thereby the need for dedicated side viewing VRU detection sensors may be eliminated.
  • the side-looking sensor 120 and the camera 110 do not necessarily require overlapping fields of view; they may as well have fields of view adjacent to each other, or with a gap in between.
  • a calculation may in the latter case be made for mapping an object detected by the camera 110 with the same object detected by the side-looking sensor 120, in some embodiments.
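  • One possible form of such a mapping calculation is sketched below in Python: the camera detection is expressed in the vehicle frame, propagated with the vehicle's own motion until the side sensor reports objects, and then associated with the nearest sensor detection inside a gate. The mounting position, motion values and gate size are illustrative assumptions, not values from the document.

```python
import math

def to_vehicle_frame(pos_sensor_xy, mount_xy, mount_yaw_rad):
    """Transform a detection from a sensor's local frame into the vehicle frame."""
    x, y = pos_sensor_xy
    c, s = math.cos(mount_yaw_rad), math.sin(mount_yaw_rad)
    return (mount_xy[0] + c * x - s * y, mount_xy[1] + s * x + c * y)

def predict_ego_motion(pos_xy, speed_mps, yaw_rate_rps, dt_s):
    """Move a stationary world point within the vehicle frame according to the
    vehicle's own motion (simple planar model, approximate)."""
    # The vehicle drives forward and rotates; a fixed point therefore appears
    # to move backwards and rotate the opposite way in the vehicle frame.
    dx, dyaw = speed_mps * dt_s, yaw_rate_rps * dt_s
    x, y = pos_xy[0] - dx, pos_xy[1]
    c, s = math.cos(-dyaw), math.sin(-dyaw)
    return (c * x - s * y, s * x + c * y)

def associate(pred_xy, sensor_detections_xy, gate_m=1.5):
    """Return the index of the sensor detection closest to the predicted
    position, provided it falls inside the association gate."""
    best_i, best_d = None, gate_m
    for i, det in enumerate(sensor_detections_xy):
        d = math.dist(pred_xy, det)
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# The camera saw a stationary pedestrian ahead; 1.5 s later the side sensor
# reports two objects, and the predicted position picks the matching one.
p_vehicle = to_vehicle_frame((8.0, 1.5), mount_xy=(2.0, 0.0), mount_yaw_rad=0.0)
p_pred = predict_ego_motion(p_vehicle, speed_mps=4.0, yaw_rate_rps=0.1, dt_s=1.5)
print(associate(p_pred, [(4.2, 2.1), (1.0, -3.0)]))   # 0
```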
  • Figure 2A schematically illustrates a scenario similar to the previously discussed scenario illustrated in Figure 1, but with the vehicle 100 seen from above and wherein a VRU 200 is depicted.
  • the camera 110 detects the VRU 200.
  • An image recognition program may recognise the VRU 200 as a VRU and possibly also categorise it as e.g. a pedestrian, child, bicyclist, animal, etc.
  • As the vehicle 100 is driving forward in the driving direction 105 and approaching the VRU 200, the VRU 200 for a moment becomes situated in an area where it is detected both by the camera 110 and the sensor 120. The VRU 200 may then be mapped with the object 200 detected by the sensor 120. Thereby it becomes possible for the sensor 120 to recognise the VRU 200 as a VRU, also when the VRU 200 is stationary. As the vehicle 100 advances in the driving direction 105, the VRU 200 moves out of sight of the camera 110 while still being situated within range of the sensor 120, as illustrated in Figure 2B. The VRU 200 may then be tracked by the sensor 120 for as long as it is situated within detection range of the sensor 120.
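  • A minimal sketch of how the VRU label obtained from the camera could stay attached to the side-sensor track after the VRU has left the camera's field of view; the class name, the dictionary-based track store and the track identifiers are assumptions made purely for illustration.

```python
class VruTracker:
    """Keeps the VRU label attached to a side-sensor track once the camera
    classification has been mapped onto it, so the object is still treated
    as a VRU after it has left the camera's field of view."""

    def __init__(self):
        self.labels = {}  # sensor track id -> VRU class obtained from the camera

    def map_camera_classification(self, track_id, vru_class):
        self.labels[track_id] = vru_class

    def on_sensor_update(self, active_track_ids):
        # Drop labels for tracks the sensor no longer reports (out of range).
        self.labels = {t: c for t, c in self.labels.items() if t in active_track_ids}
        return self.labels  # VRUs currently tracked by the side sensor alone

tracker = VruTracker()
tracker.map_camera_classification(track_id=7, vru_class="pedestrian")
print(tracker.on_sensor_update(active_track_ids={7, 9}))   # {7: 'pedestrian'}
print(tracker.on_sensor_update(active_track_ids={9}))      # {} - VRU left sensor range
```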
  • Accurate detection and tracking of any VRU 200 in the proximity of the vehicle 100 is the backbone for creating a reliable VRU warning system that only warns/intervenes when a collision with a VRU is really probable and impending. Such a system will gain higher acceptance and trust, which in turn is expected to reduce fatalities in turning accidents.
  • VRU detection is not limited to VRU warning systems, but may be used for various other purposes.
  • Figure 3 illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previously discussed scenario in Figure 1 and/or Figure 2A may be perceived by the driver of the vehicle 100.
  • the vehicle 100 comprises a control unit 310.
  • the control unit 310 is able to recognise the VRU 200 as a VRU, based on one or more images provided by the camera 110. Further, the control unit 310 is configured for receiving detection signals from the sensor 120 and mapping the detected VRU with the detection signals received from the sensor 120. The control unit 310 is also configured for tracking the VRU 200 via the sensor 120, as long as the VRU 200 is within range of the sensor 120.
  • the vehicle 100 may comprise one sensor 120-1 on the right side of the vehicle 100 and one sensor 120-2 on the left side in some embodiments. However, in other embodiments, the vehicle 100 may comprise only one sensor 120 on the right side of the vehicle 100, thereby reducing the number of sensors 120 in the vehicle 100.
  • the vehicle 100 may comprise a plurality of sensors 120 on each side of the vehicle 100.
  • the sensors 120 may be of the same, or different types, such as e.g. radar, lidar, ultrasound, time-of-flight camera, etc.
  • the vehicle 100 comprises one camera 110 situated in front of the vehicle 100 behind the windscreen.
  • the vehicle 100 may comprise a camera 110 situated at the rear part of the vehicle 100, directed in a direction opposite to the normal driving direction 105.
  • detection of VRUs 200 may be made while backing the vehicle 100.
  • the camera 110 may in such case be situated inside the rear glass, in order to be protected from dirt, snow, etc.
  • the control unit 310 may communicate with the camera 110 and sensor 120, e.g. via a communication bus of the vehicle 100, or via a wired or wireless connection.
  • Figure 4 illustrates an example of a method 400 according to an embodiment.
  • the flow chart in Figure 4 shows the method 400 for use in a vehicle 100.
  • the method 400 aims at detecting and tracking a VRU 200.
  • the vehicle 100 may be e.g. a truck, a bus, a car, a motorcycle or similar.
  • the method 400 may comprise a number of steps 401-405. However, some of these steps 401-405 may be performed in various alternative manners. Further, the described steps 401-405 may be performed in a somewhat different chronological order than the numbering suggests.
  • the method 400 may comprise the subsequent steps:
  • Step 401 comprises detecting an object 200 by a camera 110 of the vehicle 100.
  • Step 402 comprises classifying the detected 401 object 200 as a VRU 200.
  • the classification of the detected 401 object 200 may be made based on image recognition.
  • the classification may further comprise a movement prediction reliability estimation of the VRU 200, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability.
  • the classification may additionally comprise a movement prediction reliability estimation of the VRU 200, wherein motorcycle drivers are classified as having enhanced movement prediction reliability.
  • Step 403 comprises detecting the object 200 by a sensor 120 of the vehicle 100.
  • Step 404 comprises mapping the classified 402 VRU 200 with the object 200 detected 403 by the sensor 120.
  • Step 405 comprises tracking the VRU 200 by the sensor 120.
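  • Tying the steps together, a simplified and self-contained pass over steps 401-405 might look as follows in Python; the data classes, labels and the association gate are illustrative assumptions, and all positions are taken to be in a common vehicle frame.

```python
from dataclasses import dataclass

@dataclass
class CamObj:            # output of step 401 (camera detection)
    position: tuple
    label: str

@dataclass
class SensorTrack:       # output of step 403 (side-sensor detection)
    id: int
    position: tuple

def method_400(camera_objects, sensor_tracks, gate_m=1.5):
    """Illustrative pass over steps 401-405 with plain data."""
    vru_tracks = {}
    for obj in camera_objects:                                   # 401: detect by camera
        if obj.label not in {"pedestrian", "child", "cyclist"}:  # 402: classify as VRU
            continue
        for trk in sensor_tracks:                                # 403: detect by sensor
            dx = obj.position[0] - trk.position[0]
            dy = obj.position[1] - trk.position[1]
            if (dx * dx + dy * dy) ** 0.5 < gate_m:              # 404: map VRU <-> track
                vru_tracks[trk.id] = obj.label
    return vru_tracks                                            # 405: track by sensor

print(method_400([CamObj((4.0, 2.0), "pedestrian")],
                 [SensorTrack(7, (4.3, 2.2)), SensorTrack(9, (1.0, -3.0))]))
# -> {7: 'pedestrian'}
```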
  • Figure 5 illustrates an embodiment of a system 500 for detecting and tracking a VRU 200.
  • the system 500 may perform at least some of the previously described steps 401 -405 according to the method 400 described above and illustrated in Figure 4.
  • the system 500 comprises a control unit 310 in the vehicle 100.
  • the control unit 310 is arranged for detecting and tracking the VRU 200.
  • the control unit 310 is configured for detecting an object 200 by a camera 110 of the vehicle 100. Further, the control unit 310 is configured for classifying the detected object 200 as a VRU 200.
  • the control unit 310 is also configured for detecting the object 200 by a sensor 120 of the vehicle 100. In addition, the control unit 310 is configured for mapping the classified VRU 200 with the object 200 detected by the sensor 120.
  • the control unit 310 is configured for tracking the VRU 200 by the sensor 120.
  • The control unit 310 may also be configured for classifying the detected object 200 based on image recognition in some embodiments. Further, the control unit 310 may be configured for making a movement prediction reliability estimation of the VRU 200, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability. In some embodiments, the control unit 310 may be configured for making a movement prediction reliability estimation of the VRU 200, wherein motorcycle drivers are classified as having enhanced movement prediction reliability.
  • the control unit 310 comprises a receiving circuit 510 configured for receiving a signal from the camera 110 and the sensor 120.
  • the control unit 310 comprises a processor 520 configured for performing at least some steps of the method 400, according to some embodiments.
  • Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions.
  • the herein utilised expression "processor” may thus represent a processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
  • the control unit 310 may comprise a memory 525 in some embodiments.
  • the optional memory 525 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis.
  • the memory 525 may comprise integrated circuits comprising silicon-based transistors.
  • the memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
  • the control unit 310 may comprise a signal transmitter 530 in some embodiments.
  • the signal transmitter 530 may be configured for transmitting a signal to e.g. a display device, or a VRU warning system or warning device, for example.
  • the system 500 also comprises a camera 110 of the vehicle 100, for detecting an object 200.
  • the camera 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera or similar. Further, the camera 110 may be situated behind the windshield of the vehicle 100, directed forward in the driving direction 105 of the vehicle 100.
  • the system 500 may also comprise at least one sensor 120 of the vehicle 100, for detecting the object 200.
  • the at least one sensor 120 may be situated on a side of the vehicle 100 and may comprise any of a radar, lidar, ultrasonic sensor, time-of-flight camera, or thermal camera in some embodiments.
  • the camera 110 and the at least one sensor 120 may have overlapping fields of view in some embodiments.
  • steps 401-405 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 401-405.
  • a computer program product comprising instructions for performing the steps 401-405 in the control unit 310 may perform the method 400, comprising at least some of the steps 401-405.
  • some embodiments of the invention may comprise a vehicle 100, comprising the control unit 310, configured for predicting a path of a vehicle 100, according to at least some of the steps 401-405.
  • the computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-405 according to some embodiments when being loaded into the one or more processors 520 of the control unit 310.
  • the data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner.
  • the computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g., over an Internet or an intranet connection.
  • the term “and/or” comprises any and all combinations of one or more of the associated listed items.
  • the term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise.
  • the singular forms “a”, “an” and “the” are to be interpreted as “at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise.

Abstract

The invention relates to a method (400) and a control unit (310) for detecting and tracking a vulnerable road user (200). The method (400) comprises the steps of: detecting (401) an object (200) by means of a camera (110) of the vehicle (100); classifying (402) the object (200) detected in (401) as a vulnerable road user (200); detecting (403) the object (200) by means of a sensor (120) of the vehicle (100); mapping (404) the vulnerable road user (200) classified in (402) with the object (200) detected in (403) by means of the sensor (120); and tracking (405) the vulnerable road user (200) by means of the sensor (120).
PCT/SE2016/050762 2015-08-20 2016-08-16 Method, control unit and system for detecting and tracking vulnerable road users WO2017030494A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112016003241.2T 2015-08-20 2016-08-16 Method, control unit and system for detecting and tracking vulnerable road users (Verfahren, Steuereinheit und System zum Erkennen und Verfolgen verletzlicher Verkehrsteilnehmer)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1551087-8 2015-08-20
SE1551087A SE539846C2 (en) 2015-08-20 2015-08-20 Method, control unit and a system in a vehicle for detection of a vulnerable road user

Publications (1)

Publication Number Publication Date
WO2017030494A1 true WO2017030494A1 (fr) 2017-02-23

Family

ID=58051369

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2016/050762 WO2017030494A1 (fr) 2015-08-20 2016-08-16 Procédé, unité de commande et système de détection et de suivi d'usagers de la route vulnérables

Country Status (3)

Country Link
DE (1) DE112016003241T5 (fr)
SE (1) SE539846C2 (fr)
WO (1) WO2017030494A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10235882B1 (en) 2018-03-19 2019-03-19 Derq Inc. Early warning and collision avoidance
EP3936416A1 (fr) * 2020-07-10 2022-01-12 Volvo Truck Corporation Véhicule à moteur comprenant un pare-feu de carrosserie de cabine comportant une traverse inférieure et une traverse supérieure
GB2569654B (en) * 2017-12-22 2022-09-07 Sportlight Tech Ltd Apparatusses, systems and methods for object tracking
US11443631B2 (en) 2019-08-29 2022-09-13 Derq Inc. Enhanced onboard equipment

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020203729A1 (de) 2020-03-23 2021-09-23 Volkswagen Aktiengesellschaft Verfahren zum Betreiben eines Kraftfahrzeugs
DE102022206123A1 (de) * 2022-06-20 2023-12-21 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Ermittlung einer angenäherten Objektposition eines dynamischen Objektes, Computerprogramm, Vorrichtung und Fahrzeug

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1095832A1 (fr) * 1999-10-27 2001-05-02 Director General of Public Works Research Institute, Ministry of Construction Système pour la prévention des collisons entre des véhicules et des piétons
EP2133851A1 (fr) * 2007-04-02 2009-12-16 Panasonic Corporation Dispositif d'aide pour une conduite sûre
US20100205132A1 (en) * 2007-08-27 2010-08-12 Toyota Jidosha Kabushiki Kaisha Behavior predicting device
US20110246156A1 (en) * 2008-12-23 2011-10-06 Continental Safety Engineering International Gmbh Method for Determining the Probability of a Collision of a Vehicle With a Living Being
US20120035846A1 (en) * 2009-04-14 2012-02-09 Hiroshi Sakamoto External environment recognition device for vehicle and vehicle system using same
US20130282277A1 (en) * 2012-04-24 2013-10-24 Zetta Research and Development, LLC - ForC Series Generating a location in a vehicle-to-vehicle communication system
US8954252B1 (en) * 2012-09-27 2015-02-10 Google Inc. Pedestrian notifications

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2569654B (en) * 2017-12-22 2022-09-07 Sportlight Tech Ltd Apparatusses, systems and methods for object tracking
US11624825B2 (en) 2017-12-22 2023-04-11 Sportlight Technology Ltd. Object tracking
US11257370B2 (en) 2018-03-19 2022-02-22 Derq Inc. Early warning and collision avoidance
US10950130B2 (en) 2018-03-19 2021-03-16 Derq Inc. Early warning and collision avoidance
US11257371B2 (en) 2018-03-19 2022-02-22 Derq Inc. Early warning and collision avoidance
US10235882B1 (en) 2018-03-19 2019-03-19 Derq Inc. Early warning and collision avoidance
US11276311B2 (en) 2018-03-19 2022-03-15 Derq Inc. Early warning and collision avoidance
US10854079B2 (en) 2018-03-19 2020-12-01 Derq Inc. Early warning and collision avoidance
US10565880B2 (en) 2018-03-19 2020-02-18 Derq Inc. Early warning and collision avoidance
US11749111B2 (en) 2018-03-19 2023-09-05 Derq Inc. Early warning and collision avoidance
US11763678B2 (en) 2018-03-19 2023-09-19 Derq Inc. Early warning and collision avoidance
US11443631B2 (en) 2019-08-29 2022-09-13 Derq Inc. Enhanced onboard equipment
US11688282B2 (en) 2019-08-29 2023-06-27 Derq Inc. Enhanced onboard equipment
EP3936416A1 (fr) * 2020-07-10 2022-01-12 Volvo Truck Corporation Véhicule à moteur comprenant un pare-feu de carrosserie de cabine comportant une traverse inférieure et une traverse supérieure
US11560123B2 (en) 2020-07-10 2023-01-24 Volvo Truck Corporation Motor vehicle comprising a cab body firewall with a lower crossbeam and an upper crossbeam

Also Published As

Publication number Publication date
SE1551087A1 (sv) 2017-02-21
SE539846C2 (en) 2017-12-19
DE112016003241T5 (de) 2018-05-03

Similar Documents

Publication Publication Date Title
US11056002B2 (en) Method, control unit and system for avoiding collision with vulnerable road users
WO2017030494A1 (fr) Procédé, unité de commande et système de détection et de suivi d'usagers de la route vulnérables
EP3141926B1 (fr) Détection automatisée de dérive dangereux de véhicules automobiles par des capteurs de véhicules
KR102050525B1 (ko) 횡단보도에서의 사고를 방지하는 방법 및 제어 유닛
EP3414131B1 (fr) Système de réduction d'angle mort pour un véhicule
US20110215947A1 (en) System and method for collision warning
US9251709B2 (en) Lateral vehicle contact warning system
US20170038466A1 (en) Detection of an object by use of a 3d camera and a radar
Forslund et al. Night vision animal detection
KR102130059B1 (ko) 디지털 백미러 제어 유닛 및 방법
SE1550100A1 (sv) Method, control unit and system for warning
US20160299224A1 (en) Active radar activated anti-collision apparatus
US20080164983A1 (en) System for the Detection of Objects Located in an External Front-End Zone of a Vehicle, Which Is Suitable for Industrial Vehicles
RU2706757C1 (ru) Способ и блок управления для заднего обзора
JP6087240B2 (ja) 車両周辺監視装置
CN111936376B (zh) 障碍物识别方法
Tsuchiya et al. Real-time vehicle detection using a single rear camera for a blind spot warning system
SE540357C2 (en) Method and control unit in a vehicle for informing another road user of a sweep area
Rammohan et al. Automotive Collision Avoidance System: A Review
Mukhtar et al. On-road approaching motorcycle detection and tracking techniques: A survey
EP4113478A1 (fr) Système de capteur, dispositif de commande, support lisible par ordinateur non transitoire et programme informatique
GB2615290A (en) Blindspot assist system for a vehicle
Hsu et al. Object detection and tracking using an optical time-of-flight range camera module for vehicle safety and driver assist applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16837400

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112016003241

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16837400

Country of ref document: EP

Kind code of ref document: A1