SE539846C2 - Method, control unit and a system in a vehicle for detection of a vulnerable road user - Google Patents


Info

Publication number
SE539846C2
SE539846C2 (application SE1551087A)
Authority
SE
Sweden
Prior art keywords
vehicle
vru
sensor
camera
detecting
Prior art date
Application number
SE1551087A
Other languages
Swedish (sv)
Other versions
SE1551087A1 (en)
Inventor
Andersson Jonny
Bemler Marie
Ah-King Joseph
Larsson Christian
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab
Priority to SE1551087A (SE539846C2)
Priority to PCT/SE2016/050762 (WO2017030494A1)
Priority to DE112016003241.2T (DE112016003241T5)
Publication of SE1551087A1
Publication of SE539846C2


Classifications

    • B60R 21/34: Protecting non-occupants of a vehicle, e.g. pedestrians
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B60W 40/04: Estimation of driving parameters related to ambient traffic conditions
    • G01S 13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867: Combination of radar systems with cameras
    • G01S 13/93, G01S 13/931: Radar systems adapted for anti-collision purposes of land vehicles
    • G01S 15/86: Combinations of sonar systems with lidar systems or with systems not using wave reflection
    • G01S 15/931: Sonar systems adapted for anti-collision purposes of land vehicles
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar
    • G01S 17/931: Lidar systems adapted for anti-collision purposes of land vehicles
    • G06V 20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects
    • G01S 2013/9316: Anti-collision radar combined with communication equipment with other vehicles or with base stations
    • G01S 2013/9322: Anti-collision radar using additional data, e.g. driver condition, road state or weather data
    • G01S 2013/9323: Alternative operation using light waves
    • G01S 2013/9324: Alternative operation using ultrasonic waves
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Mechanical Engineering (AREA)
  • Acoustics & Sound (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Description

METHOD, CONTROL UNIT AND A SYSTEM IN A VEHICLE FOR DETECTION OF A VULNERABLE ROAD USER

TECHNICAL FIELD
This document relates to a method, a control unit and a system in a vehicle. More particularly, a method, a control unit and a system are described for detecting and tracking a Vulnerable Road User (VRU).
BACKGROUND
Non-motorised road users, such as pedestrians and cyclists, as well as motorcyclists and persons with disabilities and/or reduced mobility and orientation, are sometimes referred to as Vulnerable Road Users (VRUs). This heterogeneous group is disproportionately represented in statistics on injuries and road traffic casualties.
A particularly dangerous scenario is when VRUs are situated in the vehicle driver's blind spot when the vehicle is turning at low speeds.
No advanced warning system for VRUs in a vehicle's blind zone is yet known. Simple systems exist on the market today, based on ultrasonic sensors that identify the presence of "anything" next to the vehicle when turning or when the turn indicators are used. However, such systems cannot distinguish between, for example, a lamppost and a stationary pedestrian.
Environment sensors such as RADAR, LIDAR and cameras all have different advantages and disadvantages. For example, a camera system can be affected by dirt, snow, fog and bad lighting conditions in a way that severely limits its usefulness. Camera systems are also generally expensive. RADAR and LIDAR sensors generally cannot discriminate between road users and stationary objects unless the detected object has been observed moving. Consequently, in a RADAR- or LIDAR-based system, a stationary pedestrian waiting at a pedestrian crossing generally cannot be separated from other stationary objects, such as a lamppost.
It would thus be desirable to find a method for detecting VRUs from a vehicle that distinguishes a VRU from a stationary object, for use e.g. in a VRU warning system.
SUMMARY
It is therefore an object of this invention to solve at least some of the above problems and improve traffic safety.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle for detecting and tracking a Vulnerable Road User (VRU). The method comprises detecting an object by a camera of the vehicle. Further the method comprises classifying the detected object as a VRU. In addition the method further comprises detecting the object by a sensor of the vehicle. The method furthermore comprises mapping the classified VRU with the object detected by the sensor. Also, the method further comprises tracking the VRU by the sensor.
According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle. The control unit aims at detecting and tracking a VRU. The control unit is configured for detecting an object by a camera of the vehicle. Further the control unit is configured for classifying the detected object as a VRU. The control unit is in addition configured for detecting the object by a sensor of the vehicle. Also, the control unit is furthermore configured for mapping the classified VRU with the object detected by the sensor. The control unit is also configured for tracking the VRU by the sensor.
According to a third aspect of the invention, this objective is achieved by a computer program comprising program code for performing a method according to the first aspect when the computer program is executed in a control unit according to the second aspect.
According to a fourth aspect, this objective is achieved by a system for detecting and tracking a VRU. The system comprises a control unit according to the second aspect. Further the system also comprises a camera of the vehicle, for detecting an object. The system additionally comprises at least one sensor of the vehicle, for detecting the object.
Thanks to the described aspects, VRUs may be detected and tracked effectively by combining the respective advantages of the camera and the sensor. Accurate VRU tracking is essential e.g. for creating a reliable VRU warning system that warns/intervenes only when a collision with a VRU is genuinely probable, i.e. when the predicted path of the vehicle and the predicted path of the VRU overlap. Such a system will gain high acceptance and trust, as superfluous warnings are eliminated or at least reduced, which in turn is expected to reduce fatalities in turning accidents. Thus, increased traffic safety is achieved. Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES
Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:
Figure 1 illustrates a vehicle according to an embodiment of the invention;
Figure 2A illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 2B illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 3 illustrates an example of a vehicle interior according to an embodiment;
Figure 4 is a flow chart illustrating an embodiment of the method; and
Figure 5 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION
Embodiments of the invention described herein are defined as a method, a control unit and a system, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1 illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.
The vehicle 100 may comprise e.g. a truck, a bus or a car, or any similar vehicle or other means of conveyance.
Further, the vehicle 100 described herein may be driver-controlled or a driverless, autonomously controlled vehicle in some embodiments. However, for enhanced clarity, it is subsequently described as having a driver.
The vehicle 100 comprises a camera 110 and a sensor 120. In the illustrated embodiment, which is merely an arbitrary example, the camera 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen. An advantage of placing the camera 110 behind the windscreen is that it is protected from dirt, snow and rain, and to some extent also from damage, vandalism and/or theft.
The camera 110 may be directed towards the front of the vehicle 100, in the driving direction 105. Thereby, the camera 110 may detect a VRU in the driving direction 105 ahead of the vehicle 100. The camera may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, or a time-of-flight camera in different embodiments.
Mounting the camera 110 behind the windscreen (looking forward) has some advantages compared to externally mounted camera systems. These include the possibility to use the windscreen wipers for cleaning and to use the light from the headlights to illuminate objects in the camera's field of view. Such a multi-function camera 110 can also be used for a variety of other tasks.
The sensor 120 may be situated at the side of the vehicle 100, arranged to detect objects at the side of the vehicle 100. The sensor 120 may comprise e.g. a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar in different embodiments.
In some embodiments, the sensor 120 may comprise e.g. a motion detector, and/or be based on: a Passive Infrared (PIR) sensor, sensitive to a person's skin temperature through emitted black-body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; emission of a continuous wave of microwave radiation, detecting motion through the Doppler radar principle; emission of an ultrasonic wave, detecting and analysing the reflections; or a tomographic motion detection system based on detection of radio-wave disturbances, to mention some possible implementations.
By using at least one camera 110 and at least one sensor 120, the advantages of each type of device may be combined. The advantage of the camera 110 is that it can distinguish between e.g. a VRU and another object, even when the VRU is stationary. The advantages of the sensor 120 are its detection range, cost, robustness and ability to operate in all weather conditions. Thereby, high-confidence detections and classifications may be achieved. Thanks to the combination of the camera 110, which may detect the VRU also when it is stationary, and the sensor 120, which may track any VRU detected by the camera 110, a high-performance VRU warning/intervention function is achieved, possibly without adding any side-viewing camera to the vehicle 100. Thereby, the need for dedicated side-viewing VRU detection sensors may be eliminated.
By having overlapping fields of view of the side-looking sensor 120 and the camera 110, stationary VRUs can first be detected with the camera 110 when passing them and then "tracked" with the sensor 120 outside the field of view of the camera 110. This allows for VRU warning/intervention on stationary objects even outside the field of view of the camera 110, which is required for VRU warning in the driver's blind spot.
However, the side-looking sensor 120 and the camera 110 do not necessarily require having overlapping fields of view; they may as well have fields of view adjacent to each other, or with a gap in between. A calculation may in the latter case be made for mapping an object detected by the camera 110 with the same object detected by the side-looking sensor 120, in some embodiments.
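As an illustration of such a mapping calculation, the sketch below dead-reckons where a stationary object last seen by the forward camera should reappear in the side sensor's field of view, using only the vehicle's own motion. The function name, coordinate convention and parameters are illustrative assumptions, not taken from the description; real sensor geometries would be vehicle-specific.

```python
def predict_handover_position(x_cam, y_cam, vehicle_speed, gap_time):
    """Dead-reckon a stationary object's position across a field-of-view gap.

    (x_cam, y_cam) is the object's last observed position in a vehicle-fixed
    frame (x forward, y left, in metres) when it left the camera's view;
    vehicle_speed is in m/s and gap_time is the time in seconds until the
    object enters the side sensor's view. In the vehicle frame, a stationary
    object drifts backwards at the vehicle's own speed, so only x changes.
    """
    return (x_cam - vehicle_speed * gap_time, y_cam)
```

For example, under these assumptions an object last seen 10 m ahead and 2 m to the left, while driving at 5 m/s, would be expected 5 m ahead and 2 m to the left one second later.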
Figure 2A schematically illustrates a scenario similar to the previously discussed scenario in Figure 1, but with the vehicle 100 seen from above and with a VRU 200 depicted.
When the vehicle 100 is driving in the driving direction 105, the camera 110 detects the VRU 200. An image recognition program may recognise the VRU 200 as a VRU and possibly also categorise it as e.g. a pedestrian, child, bicyclist, animal etc.
As the vehicle 100 is driving forward in the driving direction 105 and approaching the VRU 200, the VRU 200 for a moment becomes situated in an area where it is detected both by the camera 110 and the sensor 120. The VRU 200 may then be mapped with the object 200 detected by the sensor 120. Thereby it becomes possible for the sensor 120 to recognise the VRU 200 as a VRU, also when the VRU 200 is stationary.
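One way to realise this mapping step is nearest-neighbour association between camera-classified VRUs and sensor detections in a common vehicle-fixed coordinate frame, as in the hedged sketch below. The function name, coordinate convention and gating distance are assumptions for illustration; the description does not prescribe a particular association algorithm.

```python
import math

def associate_detections(camera_vrus, sensor_objects, gate=1.5):
    """Map each camera-classified VRU to the nearest sensor detection.

    camera_vrus and sensor_objects are lists of (x, y) positions in a
    common vehicle-fixed frame, in metres; 'gate' is the maximum allowed
    association distance. Returns (camera_index, sensor_index) pairs for
    detections that could be mapped to the same physical object.
    """
    pairs = []
    used = set()
    for ci, (cx, cy) in enumerate(camera_vrus):
        best, best_d = None, gate
        for si, (sx, sy) in enumerate(sensor_objects):
            if si in used:
                continue
            d = math.hypot(cx - sx, cy - sy)
            if d < best_d:
                best, best_d = si, d
        if best is not None:
            used.add(best)
            pairs.append((ci, best))
    return pairs
```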
As the vehicle 100 advances in the driving direction 105, the VRU 200 moves out of sight of the camera 110 while still being within range of the sensor 120, as illustrated in Figure 2B. The VRU 200 may then be tracked by the sensor 120 for as long as it is within detection range of the sensor 120.
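The tracking step could, for instance, be realised with a simple alpha-beta tracker fed by the side sensor once the camera hands the VRU over. The class below is a minimal sketch under assumed names and gains; a production system would more likely use a Kalman filter with tuned process and measurement noise.

```python
class VruTrack:
    """Minimal constant-velocity (alpha-beta) track for a handed-over VRU.

    State is position (x, y) and velocity (vx, vy) in a vehicle-fixed
    frame, in metres and metres per second. Illustrative sketch only.
    """

    def __init__(self, x, y, vx=0.0, vy=0.0):
        self.x, self.y, self.vx, self.vy = x, y, vx, vy

    def predict(self, dt):
        # Propagate the state assuming constant velocity over dt seconds.
        self.x += self.vx * dt
        self.y += self.vy * dt
        return self.x, self.y

    def update(self, zx, zy, dt, alpha=0.5, beta=0.1):
        # Blend the prediction with a new sensor measurement (zx, zy).
        rx, ry = zx - self.x, zy - self.y  # innovation
        self.x += alpha * rx
        self.y += alpha * ry
        self.vx += beta * rx / dt
        self.vy += beta * ry / dt
```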
Accurate detection and tracking of any VRU 200 in the proximity of the vehicle 100 is the backbone of a reliable VRU warning system that only warns/intervenes when a collision with a VRU is genuinely probable and impending. Such a system will gain higher acceptance and trust, which in turn is expected to reduce fatalities in turning accidents.
However, the disclosed method for VRU detection is not limited to VRU warning systems, but may be used for various other purposes.
Figure 3 illustrates an example of the vehicle interior of the vehicle 100 and depicts how the previous scenario in Figure 1 and/or Figure 2A may be perceived by the driver of the vehicle 100.
The vehicle 100 comprises a control unit 310. The control unit 310 is able to recognise the VRU 200 as a VRU, based on one or more images provided by the camera 110. Further, the control unit 310 is configured for receiving detection signals from the sensor 120 and mapping the detected VRU with the detection signals received from the sensor 120. Also, the control unit 310 is configured for tracking the VRU 200 via the sensor 120, as long as the VRU 200 is within range of the sensor 120.
As illustrated, the vehicle 100 may comprise one sensor 120-1 on the right side of the vehicle 100 and one sensor 120-2 on the left side in some embodiments. However, in other embodiments, the vehicle 100 may comprise only one sensor 120 on the right side of the vehicle 100, thereby reducing the number of sensors 120 in the vehicle 100. However, in other embodiments, the vehicle 100 may comprise a plurality of sensors 120 on each side of the vehicle 100. The sensors 120 may be of the same, or different types, such as e.g. radar, lidar, ultrasound, time-of-flight camera, etc.
In the illustrated example, the vehicle 100 comprises one camera 110 situated at the front of the vehicle 100, behind the windscreen. However, in other embodiments, the vehicle 100 may comprise a camera 110 situated at the rear of the vehicle 100, directed opposite to the normal driving direction 105. Thus, detection of VRUs 200 may be made while reversing the vehicle 100. The camera 110 may in that case be situated inside the rear window, in order to be protected from dirt, snow, etc.
The control unit 310 may communicate with the camera 110 and sensor 120, e.g. via a communication bus of the vehicle 100, or via a wired or wireless connection.
Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100. The method 400 aims at detecting and tracking a VRU 200.
The vehicle 100 may be e.g. a truck, a bus, a car, a motorcycle or similar.
In order to be able to correctly detect and track the VRU 200, the method 400 may comprise a number of steps 401-405. However, some of these steps 401-405 may be performed in various alternative manners. Further, the described steps 401-405 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:

Step 401 comprises detecting an object 200 by a camera 110 of the vehicle 100.
Step 402 comprises classifying the detected 401 object 200 as a VRU 200.
The classification of the detected 401 object 200 may be made based on image recognition.
The classification may further comprise a movement prediction reliability estimation of the VRU 200, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability.
The classification may in further addition also comprise a movement prediction reliability estimation of the VRU 200, wherein motorcycle drivers are classified as having enhanced movement prediction reliability.
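The two reliability rules above can be collected into a single classification function, sketched below. The default threshold value, the class labels and the returned strings are illustrative assumptions; only the rules themselves (unattended animals and short people get reduced reliability, motorcycle drivers enhanced) come from the description.

```python
def movement_prediction_reliability(vru_class, height_m=None,
                                    attended=True, threshold_m=1.4):
    """Estimate how reliably a VRU's movement can be predicted.

    Encodes the rules from the description: unattended animals and
    people shorter than a configurable threshold are classified as
    having reduced movement prediction reliability, while motorcycle
    drivers are classified as having enhanced reliability. The 1.4 m
    default threshold and the label strings are illustrative only.
    """
    if vru_class == "motorcyclist":
        return "enhanced"
    if vru_class == "animal" and not attended:
        return "reduced"
    if vru_class == "pedestrian" and height_m is not None \
            and height_m < threshold_m:
        return "reduced"
    return "normal"
```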
Step 403 comprises detecting the object 200 by a sensor 120 of the vehicle 100.

Step 404 comprises mapping the classified 402 VRU 200 with the object 200 detected 403 by the sensor 120.

Step 405 comprises tracking the VRU 200 by the sensor 120.
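The five steps can be sketched as one control loop, with each subsystem passed in as a callable. This decomposition is an illustrative assumption; the description leaves the internal interfaces between the camera, classifier, sensor and tracker open.

```python
def vru_pipeline(camera_detect, classify, sensor_detect, associate, track):
    """Run steps 401-405 once: camera detection, classification, sensor
    detection, camera-to-sensor mapping, and tracking. Each argument is
    a callable standing in for the corresponding subsystem."""
    objects = camera_detect()                              # step 401
    vrus = [o for o in objects if classify(o) == "VRU"]    # step 402
    sensed = sensor_detect()                               # step 403
    mapping = associate(vrus, sensed)                      # step 404
    return [track(s) for _, s in mapping]                  # step 405
```

For example, a camera frame containing a pedestrian and a lamppost would yield one track, driven by the sensor detection that was mapped to the pedestrian.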
Figure 5 illustrates an embodiment of a system 500 for detecting and tracking a VRU 200. The system 500 may perform at least some of the previously described steps 401-405 according to the method 400 described above and illustrated in Figure 4.
The system 500 comprises a control unit 310 in the vehicle 100. The control unit 310 is arranged for detecting and tracking the VRU 200. The control unit 310 is configured for detecting an object 200 by a camera 110 of the vehicle 100. Further, the control unit 310 is configured for classifying the detected object 200 as a VRU 200. The control unit 310 is also configured for detecting the object 200 by a sensor 120 of the vehicle 100. In addition, the control unit 310 is configured for mapping the classified VRU 200 with the object 200 detected by the sensor 120. The control unit 310 is configured for tracking the VRU 200 by the sensor 120.
The control unit 310 may also be configured for classifying the detected object 200 based on image recognition in some embodiments. Further the control unit 310 may be configured for making a movement prediction reliability estimation of the VRU 200, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability. In some embodiments, the control unit 310 may be configured for making a movement prediction reliability estimation of the VRU 200, wherein motorcycle drivers are classified as having enhanced movement prediction reliability.
The control unit 310 comprises a receiving circuit 510 configured for receiving a signal from the camera 110 and the sensor 120.
Further, the control unit 310 comprises a processor 520 configured for performing at least some steps of the method 400, according to some embodiments.
Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The expression "processor" as used herein may thus represent processing circuitry comprising a plurality of processing circuits, such as e.g. any, some or all of the ones enumerated above.
Furthermore, the control unit 310 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e. sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.
Further, the control unit 310 may comprise a signal transmitter 530 in some embodiments. The signal transmitter 530 may be configured for transmitting a signal to e.g. a display device, or to a VRU warning system or warning device, for example.
In addition, the system 500 also comprises a camera 110 of the vehicle 100, for detecting an object 200. The camera 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera or similar. Further, the camera 110 may be situated behind the windscreen of the vehicle 100, directed forward in the driving direction 105 of the vehicle 100.
Furthermore, the system 500 may also comprise at least one sensor 120 of the vehicle 100, for detecting the object 200. The at least one sensor 120 may be situated on a side of the vehicle 100 and may comprise any of a radar, lidar, ultrasonic sensor, time-of-flight camera, or thermal camera in some embodiments.
The camera 110 and the at least one sensor 120 may have overlapping fields of view in some embodiments.
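The requirement that the forward camera and a side sensor have overlapping fields of view can be sketched as a simple angular check; note that the FOV centres and widths below are illustrative assumptions and do not come from the description, which leaves the sensor geometry open.

```python
import math

def in_fov(bearing_deg, center_deg, width_deg):
    """True if a bearing (degrees, vehicle frame) lies inside a sensor's
    angular field of view, handling wrap-around at +/-180 degrees."""
    delta = (bearing_deg - center_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= width_deg / 2.0

# Illustrative geometry: forward camera with a 60-degree FOV, and a
# right-hand side sensor centred at 90 degrees with a 150-degree FOV.
def seen_by_both(bearing_deg):
    return in_fov(bearing_deg, 0.0, 60.0) and in_fov(bearing_deg, 90.0, 150.0)
```

An object bearing that satisfies `seen_by_both` is one for which the camera classification (step 402) can be handed over to the side sensor for tracking (step 405).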
The above described steps 401-405 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 401-405. Thus a computer program product comprising instructions for performing the steps 401-405 in the control unit 310 may perform the method 400, comprising at least some of the steps 401-405 for detecting and tracking the VRU 200, when the computer program is loaded into the one or more processors 520 of the control unit 310.
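The flow of steps 402-405 (classification with a movement-prediction reliability estimate, sensor detection, mapping, and track seeding) can be sketched as below. All names, the height threshold value, and the nearest-neighbour association rule are illustrative assumptions; the claims only require that unattended animals and people below a configurable threshold length receive reduced movement-prediction reliability.

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    vru_class: str
    reduced_reliability: bool
    positions: list = field(default_factory=list)

def classify_object(obj, threshold_height_m=1.2):
    """Step 402: classify the camera detection as a VRU and estimate
    movement-prediction reliability. The 1.2 m threshold is a made-up
    example of the 'configurable threshold length' in claim 1."""
    reduced = (obj["kind"] == "animal" and not obj.get("attended", False)) or \
              (obj["kind"] == "person" and obj["height_m"] < threshold_height_m)
    return {"vru_class": obj["kind"], "reduced_reliability": reduced, "pos": obj["pos"]}

def run_method_400(camera_detection, sensor_detections):
    # Step 401 has already produced camera_detection; step 403 produced
    # sensor_detections (a list of (x, y) positions from the side sensor).
    vru = classify_object(camera_detection)                          # step 402
    # Step 404: map the classified VRU to the closest sensor detection.
    sensed = min(sensor_detections,
                 key=lambda s: abs(s[0] - vru["pos"][0]) + abs(s[1] - vru["pos"][1]))
    # Step 405: seed a track that the side sensor subsequently updates.
    track = Track(vru["vru_class"], vru["reduced_reliability"])
    track.positions.append(sensed)
    return track
```

A child-height pedestrian would thus be tracked with the reduced-reliability flag set, which a downstream warning system could use to widen its safety margins.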
Further, some embodiments of the invention may comprise a vehicle 100, comprising the control unit 310, configured for detecting and tracking the VRU 200, according to at least some of the steps 401-405.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-405 according to some embodiments when being loaded into the one or more processors 520 of the control unit 310. The data carrier may be, e.g., a hard disk, a CD-ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium, such as a disk or tape, that may hold machine-readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g., over an Internet or intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400, the control unit 310, the computer program, the system 500 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or another wired or wireless communication system.

Claims (8)

1. A method (400) in a vehicle (100) for detecting and tracking a Vulnerable Road User, VRU (200), wherein the method (400) comprises: detecting (401) an object (200) by a camera (110) of the vehicle (100); classifying (402) the detected (401) object (200) as a VRU (200) and making a movement prediction reliability estimation of the VRU (200), wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability; detecting (403) the object (200) by a sensor (120) of the vehicle (100); mapping (404) the classified (402) VRU (200) with the object (200) detected (403) by the sensor (120); and tracking (405) the VRU (200) by the sensor (120).
2. The method (400) according to claim 1, wherein the classification (402) of the detected (401) object (200) is made based on image recognition.
3. The method (400) according to any of claim 1 or claim 2, wherein the classification (402) further comprises a movement prediction reliability estimation of the VRU (200), wherein motorcycle drivers are classified as having enhanced movement prediction reliability.
4. A control unit (310) in a vehicle (100), for detecting and tracking a Vulnerable Road User, VRU (200), wherein the control unit (310) is configured for: detecting an object (200) by a camera (110) of the vehicle (100); classifying the detected object (200) as a VRU (200) based on image recognition, wherein unattended animals and people shorter than a configurable threshold length are classified as having reduced movement prediction reliability; detecting the object (200) by a sensor (120) of the vehicle (100); mapping the classified VRU (200) with the object (200) detected by the sensor (120); and tracking the VRU (200) by the sensor (120).
5. A computer program comprising program code for performing a method (400) according to any of claims 1-3 when the computer program is executed in a processor in a control unit (310), according to claim 4.
6. A system (500) for detecting and tracking a Vulnerable Road User, VRU (200), wherein the system (500) comprises: a control unit (310) according to claim 4; a camera (110) of the vehicle (100), for detecting an object (200); and at least one sensor (120) of the vehicle (100), for detecting the object (200).
7. The system (500) according to claim 6, wherein the camera (110) is situated behind the windshield of the vehicle (100), directed forward in the driving direction (105) of the vehicle (100), and the at least one sensor (120) is situated on a side of the vehicle (100) and comprises any of a radar, lidar, ultrasonic sensor, time-of-flight camera, or thermal camera.
8. The system (500) according to any of claim 6 or claim 7, wherein the camera (110) and the at least one sensor (120) have overlapping fields of view.
SE1551087A 2015-08-20 2015-08-20 Method, control unit and a system in a vehicle for detection of a vulnerable road user SE539846C2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
SE1551087A SE539846C2 (en) 2015-08-20 2015-08-20 Method, control unit and a system in a vehicle for detection of a vulnerable road user
PCT/SE2016/050762 WO2017030494A1 (en) 2015-08-20 2016-08-16 Method, control unit and system for detecting and tracking vulnerable road users
DE112016003241.2T DE112016003241T5 (en) 2015-08-20 2016-08-16 Method, control unit and system for detecting and tracking vulnerable road users

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1551087A SE539846C2 (en) 2015-08-20 2015-08-20 Method, control unit and a system in a vehicle for detection of a vulnerable road user

Publications (2)

Publication Number Publication Date
SE1551087A1 SE1551087A1 (en) 2017-02-21
SE539846C2 true SE539846C2 (en) 2017-12-19

Family

ID=58051369

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1551087A SE539846C2 (en) 2015-08-20 2015-08-20 Method, control unit and a system in a vehicle for detection of a vulnerable road user

Country Status (3)

Country Link
DE (1) DE112016003241T5 (en)
SE (1) SE539846C2 (en)
WO (1) WO2017030494A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2569654B (en) 2017-12-22 2022-09-07 Sportlight Tech Ltd Apparatusses, systems and methods for object tracking
US11257370B2 (en) 2018-03-19 2022-02-22 Derq Inc. Early warning and collision avoidance
CA3148680A1 (en) 2019-08-29 2021-03-04 Derq Inc. Enhanced onboard equipment
DE102020203729A1 (en) 2020-03-23 2021-09-23 Volkswagen Aktiengesellschaft Method for operating a motor vehicle
EP3936416B1 (en) 2020-07-10 2024-06-19 Volvo Truck Corporation Motor vehicle comprising a cab body firewall with a lower cross beam and an upper cross beam
DE102022206123A1 (en) * 2022-06-20 2023-12-21 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining an approximate object position of a dynamic object, computer program, device and vehicle

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3174832B2 (en) * 1999-10-27 2001-06-11 建設省土木研究所長 Crossing pedestrian collision prevention system
US8725309B2 (en) * 2007-04-02 2014-05-13 Panasonic Corporation Safety driving support apparatus
JP4349452B2 (en) * 2007-08-27 2009-10-21 トヨタ自動車株式会社 Behavior prediction device
DE102008062916A1 (en) * 2008-12-23 2010-06-24 Continental Safety Engineering International Gmbh Method for determining a collision probability of a vehicle with a living being
JP5210233B2 (en) * 2009-04-14 2013-06-12 日立オートモティブシステムズ株式会社 Vehicle external recognition device and vehicle system using the same
US20130278441A1 (en) * 2012-04-24 2013-10-24 Zetta Research and Development, LLC - ForC Series Vehicle proxying
US9196164B1 (en) * 2012-09-27 2015-11-24 Google Inc. Pedestrian notifications

Also Published As

Publication number Publication date
SE1551087A1 (en) 2017-02-21
DE112016003241T5 (en) 2018-05-03
WO2017030494A1 (en) 2017-02-23

Similar Documents

Publication Publication Date Title
US11056002B2 (en) Method, control unit and system for avoiding collision with vulnerable road users
WO2017030494A1 (en) Method, control unit and system for detecting and tracking vulnerable road users
US11541810B2 (en) System for reducing a blind spot for a vehicle
KR102050525B1 (en) Method and control unit to prevent accidents on pedestrian crossings
JP6649865B2 (en) Object detection device
US8471726B2 (en) System and method for collision warning
US20120287276A1 (en) Vision based night-time rear collision warning system, controller, and method of operating the same
CN109415018B (en) Method and control unit for a digital rear view mirror
EP4339648A1 (en) Determining objects of interest for active cruise control
EP1854666B1 (en) System for the detection of objects located in an external front-end zone of a vehicle, which is suitable for industrial vehicles
SE1550100A1 (en) Method, control unit and system for warning
US11407358B2 (en) Method and control unit for rear view
JP6087240B2 (en) Vehicle periphery monitoring device
EP3774500B1 (en) Method for obstacle identification
BR112018001990B1 (en) METHOD ON A VEHICLE, CONTROL UNIT ON A VEHICLE AND SYSTEM FOR AVOIDING A POTENTIAL COLLISION BETWEEN THE VEHICLE AND A VULNERABLE ROAD USER
Hsu et al. Object detection and tracking using an optical time-of-flight range camera module for vehicle safety and driver assist applications
Amir et al. On-Road Approaching Motorcycle Detection and Tracking Techniques: A Survey

Legal Events

Date Code Title Description
NUG Patent has lapsed