WO2017204719A1 - Method for decentralised sensor fusion in a vehicle and sensor fusion system - Google Patents

Method for decentralised sensor fusion in a vehicle and sensor fusion system

Info

Publication number
WO2017204719A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
identified
identified objects
sensor
same
Prior art date
Application number
PCT/SE2017/050495
Other languages
French (fr)
Inventor
Christian Larsson
Hjalmar LUNDIN
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Publication of WO2017204719A1 publication Critical patent/WO2017204719A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/22 Character recognition characterised by the type of writing
    • G06V 30/224 Character recognition characterised by the type of writing of printed characters having additional code marks or containing code marks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W 2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W 2420/403 Image sensing, e.g. optical camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20212 Image combination
    • G06T 2207/20221 Image fusion; Image merging

Definitions

  • the present disclosure relates to a method for decentralised sensor fusion in a vehicle and a sensor fusion system. It further relates to a vehicle, a computer program and a computer program product.
  • Sensor fusion is used to improve the information about objects which are sensed by more than one sensor. For fusing information from two sources, such as two sensors, it is important to identify which part of the information from the two sources relates to the same object(s) and which part does not. The information which relates to the same object is then fused.
  • There are two main principles of sensor fusion systems: measurement-to-track association and track-to-track association.
  • In measurement-to-track association, also called centralised sensor fusion or low-level fusion, basically all measurement results from the sensors are transmitted to a fusion centre. Such a transmission of basically all measurement results also includes noise.
  • the fusion centre is then arranged to analyse the measured data and to determine what information could be extracted from the measured data and how this information relates, or partly relates, to the same object(s) or to different objects.
  • In track-to-track association, also called decentralised sensor fusion or high-level fusion, the sensors perform some analysis of the measured or sensed data.
  • As an example, objects can be identified by the sensors themselves. Such sensors are sometimes called smart sensors.
  • Only the analysed data is then transmitted to the fusion centre.
  • the fusion centre is then arranged to determine which part of the analysed data from different sensors relates to the same object(s) and which part relates to different objects. Both track-to-track associations and measurement-to-track associations are computationally intensive.
  • Another objective of the present disclosure is to provide an alternative way of performing decentralised sensor fusion.
  • a method for decentralised sensor fusion in a vehicle comprises sensing and identifying objects with a plurality of sensor arrangements. Said sensing and identifying is performed independently by each sensor arrangement out of said plurality of sensor arrangements so that each sensor arrangement provides a respective set of identified objects.
  • the method further comprises for each sensor arrangement out of said plurality of sensor arrangements: associating, by said sensor arrangement, data to said identified objects in said respective set of identified objects.
  • the method even further comprises combining the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided.
  • Said associated data comprises value(s) of at least one non-kinematic attribute which is provided by the corresponding sensor arrangement.
  • Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method.
  • Combining said identified objects from all said respective sets of identified objects comprises first combining identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets. This has the advantage that the steps of gating and associating objects to each other can be skipped for certain objects in a decentralised sensor fusion method.
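By way of illustration only, a minimal Python sketch of this combining order for two sets could look as follows. The dict layout keyed by NKA value and the helper parameters fuse and gate_and_associate (standing in for whatever fusion and gating/association routines are used) are assumptions, not part of the disclosure:

```python
# Sketch of the claimed combining order: object pairs whose NKA values were
# identified as the same physical object in a previous run are fused
# directly, so gating and association are skipped for them.

def combine(set_a, set_b, known_pairs, fuse, gate_and_associate):
    """set_a, set_b: dicts mapping an NKA value (e.g. an ID-tag) to the
    object data reported by one sensor arrangement.
    known_pairs: set of (nka_a, nka_b) tuples matched in previous runs.
    fuse, gate_and_associate: caller-supplied routines (assumptions)."""
    fused = []
    remaining_a, remaining_b = dict(set_a), dict(set_b)

    # First combine objects whose NKA values had been combined previously.
    for nka_a, nka_b in known_pairs:
        if nka_a in remaining_a and nka_b in remaining_b:
            fused.append(fuse(remaining_a.pop(nka_a), remaining_b.pop(nka_b)))

    # Only the remaining objects go through costly gating and association.
    for obj_a, obj_b in gate_and_associate(remaining_a, remaining_b):
        fused.append(fuse(obj_a, obj_b))
    return fused
```

The saving comes from the first loop: it is linear in the number of remembered pairs, whereas gating and association typically scale with the product of the set sizes.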
  • said non-kinematic attribute is an ID-tag. This provides an easy way of associating a non-kinematic attribute to an object.
  • the method is repeatedly performed during operation of the vehicle. This allows for constantly providing input data to autonomous or assisted systems for vehicle driving.
  • said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects which have been identified as the same object for a pre-determined number of previous runs of the method. This allows increasing the quality of the method by sorting out accidental identification of objects in different sets as the same physical object.
  • said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects for which a plausibility check indicates that identifying these objects as the same object again is plausible.
  • This allows increasing the quality of the method by sorting out non-justified identifications of objects in different sets as the same physical object.
  • the sensor fusion system comprises a plurality of sensor arrangements.
  • Each sensor arrangement out of the plurality of sensor arrangements is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement.
  • the sensor fusion system further comprises a fusion centre arrangement.
  • the fusion centre arrangement is arranged to combine the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided.
  • Said associated data comprises value(s) of at least one non-kinematic attribute, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects and associating data to said identified objects in said respective set of identified objects by each sensor arrangement.
  • the fusion centre arrangement is arranged to, when combining said identified objects from all said respective sets of identified objects, first combine identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining other identified objects from said respective sets.
  • said non-kinematic attribute is an ID-tag.
  • said fusion centre arrangement is arranged to first combine the identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects which have been identified as the same object for a pre-determined number of times.
  • the fusion centre arrangement is further arranged to perform plausibility checks and to perform said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects for which the plausibility check indicates that identifying these objects as the same object again is plausible.
  • At least some of the objectives are achieved by a vehicle comprising a sensor fusion system according to the present disclosure.
  • a computer program for decentralised sensor fusion in a vehicle comprises program code for causing an electronic control unit or a computer connected to the electronic control unit to perform the steps according to the method of the present disclosure.
  • a computer program product containing a program code stored on a computer-readable medium for performing the method according to the present disclosure, when said computer program is run on an electronic control unit or a computer connected to the electronic control unit.
  • the system, the vehicle, the computer program and the computer program product have corresponding advantages as have been described in connection with the corresponding examples of the method according to this disclosure.
  • Fig. 1 shows, in a schematic way, a vehicle according to one embodiment of the present invention
  • Fig. 2 shows, in a schematic way, a sensor fusion system according to one embodiment of the present invention
  • Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art
  • Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present disclosure
  • Fig. 4 shows, in a schematic way, a flow chart over an example of a method according to the present invention.
  • Fig. 5 shows, in a schematic way, a device which can be used in connection with the present invention.
  • Fig. 1 shows a side view of a vehicle 100.
  • the vehicle comprises a tractor unit 110 and a trailer unit 112.
  • the vehicle 100 can be a heavy vehicle such as a truck. In one example, no trailer unit is connected to the vehicle 100.
  • the vehicle 100 comprises a sensor fusion system 299, see Fig. 2.
  • the sensor fusion system 299 can be arranged in the tractor unit 110.
  • the vehicle 100 is a bus.
  • the vehicle 100 can be any kind of vehicle.
  • Other examples of vehicles are boats, passenger cars, construction vehicles, and locomotives.
  • The term "link" refers to a communication link which may be a physical connection such as an opto-electronic communication line, or a non-physical connection such as a wireless connection, e.g. a radio link or microwave link.
  • Fig. 2 shows schematically an embodiment of a sensor fusion system 299.
  • Said sensor fusion system 299 comprises a plurality of sensor arrangements 220a, 220b, 220c, ...
  • Said plurality of sensor arrangements 220a, 220b, 220c, ... comprises at least a first sensor arrangement 220a and a second sensor arrangement 220b.
  • said plurality of sensor arrangements 220a, 220b, 220c, ... comprises a third sensor arrangement 220c.
  • Said first sensor arrangement 220a is arranged to sense and identify objects.
  • Said objects are preferably objects outside the vehicle in which the sensor fusion system 299 is provided.
  • Said first sensor arrangement 220a can comprise a camera.
  • the camera is in one example arranged to detect visible light.
  • Said visible light can have been emitted or reflected by objects.
  • the camera is in one example arranged to detect infrared light.
  • the camera can be arranged to detect any kind of electromagnetic radiation.
  • the camera is arranged to provide images of the surrounding of the vehicle.
  • Said first sensor arrangement can comprise an analysing unit for the camera.
  • Said analysing unit for the camera is arranged to identify objects in the images which are provided by the camera.
  • Said first sensor arrangement 220a is arranged to provide a first set of identified objects.
  • the set of identified objects comprises the objects which have been sensed by the camera and identified by the analysing unit for the camera.
  • a set of identified objects by a sensor arrangement can comprise any number of identified objects. In one example, no object is identified. In one example one object is identified. In one example more than one object is identified.
  • Said first sensor arrangement 220a is arranged to associate data to said identified objects in said first set of identified objects. Said data can comprise the position of said identified objects. The position of an identified object is in one example provided in one dimension, for example as a distance to the sensor.
  • the position of an identified object is in one example provided in two dimensions, for example as a distance to the sensor and an angle in relation to the sensor.
  • the position of an identified object is in one example provided in three dimensions, for example as a distance to the sensor and two angles in relation to the sensor.
  • the position can also be provided in a one-, two-, or three-dimensional coordinate system, such as a Cartesian coordinate system.
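As an illustrative aside (not from the patent text), a position given as a distance and two angles can be converted to a three-dimensional Cartesian representation as follows; the angle convention (azimuth in the horizontal plane, elevation above it) is an assumption:

```python
import math

def polar_to_cartesian(distance, azimuth, elevation):
    """Convert a sensor-relative position (distance in metres, angles in
    radians) to Cartesian (x, y, z) in the sensor's own frame."""
    x = distance * math.cos(elevation) * math.cos(azimuth)
    y = distance * math.cos(elevation) * math.sin(azimuth)
    z = distance * math.sin(elevation)
    return x, y, z
```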
  • Said associated data can comprise a velocity of the object.
  • Said velocity can be one-, two-, or three-dimensional.
  • Said associated data comprises value(s) of at least one non-kinematic attribute, NKA.
  • Examples of non-kinematic attributes are an ID-tag, a kind of object, a colour of the object, a size of the object, or the like.
  • the first sensor arrangement 220a can be arranged to associate a number as a value for an ID-tag as a NKA to each identified object.
  • the first sensor arrangement 220a is arranged to associate a colour value as a NKA to each identified object.
  • the first sensor arrangement 220a is arranged to associate a value for the kind of object as a NKA to each identified object. Examples of values for kind of objects are bus, passenger car, truck, bicycle, human being, infrastructure, or the like.
  • Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object during consecutive providing of a first set of identified objects and associating data to said identified objects in said first set of identified objects by said first sensor arrangement 220a.
  • the type of object and/or the colour of the object are intended to remain the same. This is due to the fact that objects typically do not change colour or type. However, this might occasionally happen, for example when a camera is first exposed to direct sunlight and then enters a shadowed area, so that the apparent colour of the object in the camera image can change.
  • the ID-tag of the object or the size of the object are examples of values which are intended to remain the same.
  • Said first sensor arrangement 220a is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ...
  • Said second sensor arrangement 220b is arranged to perform basically the same tasks as the first sensor arrangement 220a.
  • said second sensor arrangement 220b is arranged to sense and identify objects.
  • Said second sensor arrangement 220b can comprise a radar arrangement.
  • the radar arrangement is in one example arranged to detect reflected radar beams from objects.
  • the radar arrangement is arranged to provide radar images of the surrounding of the vehicle.
  • Said second sensor arrangement 220b can comprise an analysing unit for the radar arrangement.
  • Said analysing unit for the radar arrangement is arranged to identify objects in the radar images which are provided by the radar arrangement.
  • Said second sensor arrangement 220b is arranged to provide a second set of identified objects.
  • the second set of identified objects comprises the objects which have been sensed by the radar arrangement and identified by the analysing unit for the radar arrangement.
  • Said second sensor arrangement 220b is arranged to associate data to said identified objects in said second set of identified objects.
  • the second sensor arrangement 220b is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.
  • the first set of identified objects and the second set of identified objects can contain either the same objects, different objects, or partly the same and partly different objects. That the objects, at least partly, can be different objects is in one example due to the fact that the first and the second sensor arrangement 220a and 220b are arranged to sense different parts of the surrounding of the vehicle. Thus, an object visible in the field of view or the sensing range of the first sensor arrangement might not necessarily be visible in the field of view or the sensing range of the second sensor arrangement. That the objects, at least partly, can be different objects is in one example due to the fact that an object which can be sensed by a camera arrangement is not necessarily sensible by the radar arrangement, and vice versa. Said second sensor arrangement 220b is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ...
  • Said third sensor arrangement 220c is arranged to perform basically the same tasks as the first sensor arrangement 220a and the second sensor arrangement 220b.
  • said third sensor arrangement 220c is arranged to sense and identify objects.
  • Said third sensor arrangement 220c can comprise a lidar arrangement.
  • the lidar arrangement is in one example arranged to detect reflected laser beams from objects.
  • the lidar arrangement is arranged to provide images of the surrounding of the vehicle based on detection of the reflected laser beams.
  • Said third sensor arrangement 220c can comprise an analysing unit for the lidar arrangement.
  • Said analysing unit for the lidar arrangement is arranged to identify objects in the images which are provided by the lidar arrangement.
  • Said third sensor arrangement 220c is arranged to provide a third set of identified objects.
  • the third set of identified objects comprises the objects which have been sensed by the lidar arrangement and identified by the analysing unit for the lidar arrangement.
  • Said third sensor arrangement 220c is arranged to associate data to said identified objects in said third set of identified objects.
  • the third sensor arrangement 220c is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.
  • Said third sensor arrangement 220c is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ...
  • the sensor fusion system 299 comprises a fusion centre arrangement.
  • Said fusion centre arrangement can be a first control unit 200.
  • the first sensor arrangement 220a is arranged to transmit said first set of identified objects and said associated data to the first control unit 200.
  • Said first control unit 200 can be arranged to control operation of said first sensor arrangement 220a.
  • Said first control unit 200 is arranged for communication with said first sensor arrangement 220a via a link L220a.
  • Said first control unit 200 is arranged to receive information from said first sensor arrangement 220a.
  • the second sensor arrangement 220b is arranged to transmit said second set of identified objects and said associated data to the first control unit 200.
  • Said first control unit 200 can be arranged to control operation of said second sensor arrangement 220b.
  • Said first control unit 200 is arranged for communication with said second sensor arrangement 220b via a link L220b.
  • Said first control unit 200 is arranged to receive information from said second sensor arrangement 220b.
  • the third sensor arrangement 220c is arranged to transmit said third set of identified objects and said associated data to the first control unit 200.
  • Said first control unit 200 can be arranged to control operation of said third sensor arrangement 220c.
  • Said first control unit 200 is arranged for communication with said third sensor arrangement 220c via a link L220c.
  • Said first control unit 200 is arranged to receive information from said third sensor arrangement 220c.
  • the transmission is performed over a CAN-bus of the vehicle.
  • the first control unit 200 is arranged to combine the identified objects from the first, second, and third set of identified objects based on the associated data.
  • the first control unit 200 is arranged to perform the combining so that same objects between the different sets of identified objects are identified.
  • the first control unit 200 is arranged to perform the combining so that a common set of objects is provided.
  • the plurality of sensor arrangements 220a, 220b, 220c, ... and the first control unit 200 are preferably arranged to repeatedly perform the actions that they are arranged for.
  • the sensing and identifying, the transmitting and the combining can be performed approximately every 17 milliseconds. Any other time period for the repetition is possible as well.
  • the first control unit 200 is arranged to, when combining said identified objects from the first, second, and third sets of identified objects, first combine identified objects out of the first, second, and third sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said first, second and third sets. This is explained in more detail in relation to Fig. 3b. So far only three sensor arrangements have been described. In general, however, more than the three described sensor arrangements can be part of the plurality of sensor arrangements 220a, 220b, 220c, ... Any of the described first, second, or third sensors could be provided several times.
  • the sensor fusion system 299 could be provided with a left side radar and a right side radar. Both the left and the right side radar could be arranged to operate as the described second sensor arrangement. Several cameras, several lidars, or the like can also be provided. Other types of sensor arrangements can also be provided as part of the plurality of sensor arrangements. Examples of such sensor arrangements are sensors arranged to receive information from other vehicles or from infrastructure. In one example, a sensor for receiving vehicle-to-vehicle communication is provided. Other vehicles can send information regarding their position, for example obtained via a GPS receiver, and/or their type of vehicle to the sensor. Another example is a sensor for receiving infrastructure-to-vehicle communication.
  • the sensor could be arranged to receive information regarding road signs, such as speed limits, status of a traffic light, or the like.
  • the first control unit 200 is in one example arranged to transmit information regarding the common set of objects to a device array 270.
  • Said first control unit 200 can be arranged to control operation of said device array 270.
  • Said first control unit 200 is arranged for communication with said device array 270 via a link L270.
  • Said first control unit 200 can be arranged to receive information from said device array 270.
  • the device array 270 is arranged to receive the transmitted information from the first control unit 200.
  • the device array 270 can comprise one or more devices 271-276, ...
  • the device array 270 comprises an adaptive cruise control device 271.
  • the device array 270 comprises a warning system for pedestrians and/or bicyclists 272. In one example, the device array 270 comprises a blind spot warning device 273. In one example, the device array 270 comprises an automatic or assisted queue driving system 274. In one example, the device array 270 comprises an automatic or assisted overtaking system 275. In one example, the device array 270 comprises an automatic or assisted reversing system 276. Said received information can then be provided as input information to any of the device(s) in the device array 270. In principle, any device or system which needs information regarding surrounding objects of the vehicle can be part of the device array 270. In one example, a screen is part of the device array 270.
  • a second control unit 205 is arranged for communication with the first control unit 200 via a link L205 and may be detachably connected to it. It may be a control unit external to the vehicle 100. It may be adapted to conduct the innovative method steps according to the invention.
  • the second control unit 205 may be arranged to perform the inventive method steps according to the invention. It may be used to cross-load software to the first control unit 200, particularly software for conducting the innovative method. It may alternatively be arranged for communication with the first control unit 200 via an internal network on board the vehicle. It may be adapted to performing substantially the same functions as the first control unit 200, such as adapting the control of the gas engine in a vehicle.
  • the innovative method may be conducted by the first control unit 200 or the second control unit 205, or by both of them.
  • the sensor fusion system 299 can perform any of the method steps described later in relation to Fig. 4.
  • Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art, i.e. the fusion of sensor data relating to sensed objects.
  • Two sets of identified objects are provided.
  • a first set of identified objects comprises identified objects 300-305.
  • the first set is indicated by circles filled with black colour.
  • a second set of identified objects comprises identified objects 310- 315.
  • the second set is indicated by circles filled with white colour.
  • An object of sensor fusion is to combine the two sets of identified objects so that a common set of identified objects is provided. In particular, the same physical objects among the identified objects in the first and the second set have to be found.
  • An attempt is made to match a first identified object 300 of the first set to all identified objects 310-315 of the second set. This is indicated by continuous lines.
  • An attempt is made to match a second identified object 301 of the first set to all identified objects 310-315 of the second set. This is indicated by dashed lines. This procedure continues until all objects in the first set and the second set are checked against each other as to whether they refer to the same physical object.
  • An attempt is made to match a last identified object 305 of the first set to all identified objects 310-315 of the second set. This is indicated by dotted lines.
  • The other identified objects of the first set are also matched against the identified objects of the second set.
  • the lines indicating these matchings have been omitted for not overloading the figure.
  • the matching could, for example, include calculating the probability that an object from the first set refers to the same physical object as an object from the second set. The two objects with the highest probability of matching are then matched together. Some thresholds or the like can be used to avoid matching objects which in reality do not correspond to the same physical object and which have not been identified in both sets.
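A naive Python sketch of such exhaustive matching could look as follows; the scoring function and threshold are left as caller-supplied assumptions. Note that it evaluates a score for every cross-set pair, which is the computational load the present disclosure reduces:

```python
import itertools

def match_exhaustively(first_set, second_set, score, threshold):
    """Score every cross-set pair and greedily accept the best-scoring
    pairs above the threshold; score evaluations grow as N * M."""
    pairs = sorted(
        ((score(a, b), i, j)
         for (i, a), (j, b) in itertools.product(
             enumerate(first_set), enumerate(second_set))),
        key=lambda t: t[0], reverse=True)
    matched, used_i, used_j = [], set(), set()
    for s, i, j in pairs:
        if s < threshold:
            break  # remaining pairs are too unlikely to be the same object
        if i not in used_i and j not in used_j:
            matched.append((first_set[i], second_set[j]))
            used_i.add(i)
            used_j.add(j)
    return matched
```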
  • Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present invention.
  • Two sets of identified objects are provided.
  • a first set of identified objects comprises identified objects 300-305.
  • the first set is indicated by circles filled with black colour.
  • a second set of identified objects comprises identified objects 310-315.
  • the second set is indicated by circles filled with white colour.
  • the identified objects can be the same as in the prior art solution. However, all identified objects will have a respective NKA associated to them, for example an ID-tag.
  • the identified objects 300-305 of the first set could have A1, A5, A13, A3, A127, and A21 as values for the respective ID-tag.
  • the identified objects 310-315 of the second set could have B3, B69, B338, B15, B7, and B28 as values for the respective ID-tag.
  • the values for the NKA are provided independently by the two sensor arrangements, i.e. the first sensor arrangement providing the first set of identified objects does in general not "know" what values for the NKA are provided by the second sensor arrangement, and vice versa.
  • the sensor providing the first set and the sensor providing the second set are so-called smart sensors or so-called tracking sensors.
  • the sensor arrangement providing the first set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed.
  • the sensor arrangement providing the second set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed.
  • As an example, if the first identified object of the first set represents a specific pedestrian, the first sensor arrangement is arranged to associate the value A1 again to the object representing that pedestrian the next time the sensor arrangement senses and identifies objects, i.e. at the next run of a method according to the present disclosure.
  • the specific pedestrian does not need to be the first identified object but is, for example, the third identified object.
  • the value A1 of the ID-tag will be associated to the third identified object at the next run.
  • In the second set, the same specific pedestrian as discussed in connection with the first sensor arrangement is identified as the third object and receives, as an example, B338 as value for the ID-tag.
  • When trying to match the objects from the two sets, the fusion centre arrangement will first analyse whether an object with a first value of the NKA of the first set has been matched to an object with a second value of the NKA of the second set before, i.e. at a previous run of the method. As an example, the fusion centre arrangement will receive A1 as the value for the ID-tag for the first identified object of the first set. The fusion centre arrangement will then analyse whether A1 was matched to a value for the ID-tag in the second set before. The fusion centre arrangement might find that A1 has been matched to B338 before and will then find that B338 belongs to the third identified object of the second set. The fusion centre will then match these two objects before matching any objects whose values of the NKA have not been matched previously.
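Using the ID-tag values from this example, the lookup can be illustrated with a small, purely hypothetical Python snippet:

```python
# Match history from a previous run: ID-tag in the first set -> ID-tag in
# the second set (A1 and B338 are the values used in the example above).
previous_matches = {"A1": "B338"}

first_set = ["A1", "A5", "A13", "A3", "A127", "A21"]    # current first set
second_set = ["B3", "B69", "B338", "B15", "B7", "B28"]  # current second set

present_in_second = set(second_set)
for tag_a in first_set:
    tag_b = previous_matches.get(tag_a)
    if tag_b in present_in_second:
        # Prints "fuse A1 with B338": these two objects are matched before
        # any objects without previously matched NKA values.
        print(f"fuse {tag_a} with {tag_b}")
```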
  • Fig. 4 shows, in a schematic way, a flow chart over an example of a method 400 for decentralised sensor fusion in a vehicle according to the present invention. Not all steps of the described method are necessary. Instead, the method is presented in a way in which it might be implemented in a vehicle. However, several of the steps are optional and can thus be omitted in other examples of the method. Which steps are optional will in general depend on the specific implementation on a specific vehicle.
  • the method starts with a step 410 of sensing and identifying objects.
  • Said sensing can be performed by camera arrangements, radar arrangements, lidar arrangements or the like as described in relation to Fig. 2.
  • Said sensing can also be performed by receiving information which is sent out from other vehicles or infrastructure as described in relation to Fig. 2.
  • Step 410 is performed independently by a plurality of sensor arrangements. Each sensor arrangement is arranged to identify objects based on the data it receives through the sensing. The sensing and identifying of the objects is performed by each sensor arrangement in such a way that a set of identified objects is provided. Step 410 can be performed repeatedly.
  • the method continues with step 420.
  • In step 420, data is associated to the identified objects in the set of objects. This is performed independently by each sensor arrangement for the respective set of identified objects.
  • Said associated data can comprise position and velocity of the objects as described before.
  • Said associated data comprises value(s) of at least one non-kinematic attribute, NKA.
  • Said value(s) of the at least one NKA is/are independently provided by each sensor arrangement.
  • Said value(s) of said at least one NKA is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method 400. As an example, in case a first car is identified by a first sensor arrangement, it is intended that the same value of the NKA always is associated to said first car. This can be achieved in a number of different ways.
  • the sensor arrangement can be arranged to analyse the ID-numbers of other cars and always associate the same value of the NKA to cars sending out the same ID-number.
  • a sensor arrangement comprises a camera which provides images. The provided images can be analysed by the sensor arrangement and values for the NKA can be associated to each identified object in the image.
  • the sensor arrangement typically is arranged to repeat step 410 on the order of some tens of milliseconds. Typical objects surrounding a vehicle will not move very far during such a time period. Step 420 can be performed repeatedly.
  • the sensor arrangement can then be arranged to associate the same value for the NKA to each identified object which is at basically the same place in the provided image on a next run of the method 400.
  • Many sensor arrangements can keep track of identified objects. Basically any such sensor arrangement can be used to associate the same value of the NKA to the same identified object in a next run of step 420, as sketched below.
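A minimal sketch of such a position-based re-identification could look as follows; the class name, the greedy nearest-neighbour strategy, and the one-metre threshold are all assumptions for illustration:

```python
import itertools
import math

class IdTagTracker:
    """Re-issue the previous run's ID-tag to a detection that appears at
    basically the same place; otherwise issue a fresh tag."""

    def __init__(self, max_jump=1.0):
        self.max_jump = max_jump   # metres an object may move between runs
        self.prev = {}             # ID-tag -> last known (x, y) position
        self._counter = itertools.count(1)

    def assign(self, detections):
        """detections: list of (x, y) positions from the current run;
        returns one ID-tag per detection, stable across runs."""
        tags = []
        free = dict(self.prev)    # tags not yet re-used in this run
        for pos in detections:
            nearest = min(free, key=lambda t: math.dist(free[t], pos),
                          default=None)
            if nearest is not None and math.dist(free[nearest], pos) <= self.max_jump:
                tags.append(nearest)
                del free[nearest]
            else:
                tags.append(f"A{next(self._counter)}")
        self.prev = dict(zip(tags, detections))
        return tags
```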
  • the method continues with step 430.
  • In step 430, the set of identified objects and the associated data is transmitted to a fusion centre arrangement.
  • Said fusion centre arrangement can be an electronic control unit as described in relation to Fig. 2.
  • Said fusion centre arrangement can be an existing electronic control unit in the vehicle.
  • Said electronic control unit can be situated at a distance from the sensor arrangement.
  • said fusion centre arrangement is implemented at one sensor arrangement out of the plurality of sensor arrangements.
  • the method continues with step 440. All of the following steps are preferably performed by the fusion centre arrangement.
  • In step 440, it is analysed which value(s) of said at least one non-kinematic attribute of the identified objects out of the different sets had been combined previously and had been identified as same objects. This can be done in the following way: For an object in a first set of identified objects the value(s) of the NKA is/are determined.
  • These value(s) can be denoted first value(s) of the NKA. It is analysed whether that object of the first set previously has been identified as representing the same physical object as an identified object in any of the other sets of identified objects. This can be done by analysing, for example in a look-up table, whether said first value(s) of the NKA had previously been combined with second value(s) of the NKA attributed to identified objects in any other set of identified objects. It can then be analysed whether any of said second value(s) of the NKA are present in any other of the present sets of identified objects. If this is the case, it can be concluded that the object(s) in the other sets of identified objects having said second value(s) of the NKA still relate to the same physical object.
  • In one example, the optional step 490 is performed before continuing with step 470 for the objects of the different sets of identified objects which have been identified as representing the same physical object.
  • The above described analysis can be done for all identified objects of all sets of identified objects.
  • In one example, said pre-determined number of times can be two, three, four, five, or more times. Only if these objects have been analysed as representing the same physical object for at least said pre-determined number of times does the method continue with step 470 for these objects.
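A simple way to keep such a count is sketched below; the threshold of three runs and all names are assumptions:

```python
from collections import defaultdict

REQUIRED_RUNS = 3                   # the pre-determined number (assumption)
pair_run_counts = defaultdict(int)  # (tag_a, tag_b) -> runs matched so far

def record_same_object(tag_a, tag_b):
    """Called whenever a run identifies tag_a and tag_b as the same object."""
    pair_run_counts[(tag_a, tag_b)] += 1

def fast_path_allowed(tag_a, tag_b):
    """Allow skipping gating/association only after enough confirming runs."""
    return pair_run_counts[(tag_a, tag_b)] >= REQUIRED_RUNS
```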
  • The method continues with step 450 for the objects which do not continue with step 490 or step 470.
  • In step 450, the data relating to the remaining objects is gated. This is performed in accordance with any gating process in combination with sensor fusion which is known in the art. The method continues with step 460.
  • In step 460, an association algorithm is performed. This algorithm is intended to find identified objects representing the same physical object among the remaining objects in the sets of identified objects. Such association algorithms are well known in the art.
  • The method then continues with step 470. In one example, the order of the steps 460 and 470 is reversed.
  • In step 470, data relating to the same physical object is fused.
  • the process of data fusion for data relating to the same physical object is known in the art. That data relates to the same physical object is known either from step 460 or from step 440, or from step 490, respectively.
  • The method then continues with step 480. In the optional step 490 it is analysed whether it is plausible that identified objects from at least two sets of identified objects really represent the same physical object.
  • Such a plausibility test can be performed in a number of ways. In one example, it is analysed whether the identified objects have approximately the same velocity. In one example, it is analysed whether the identified objects have approximately the same position. In one example, it is analysed whether the identified objects are of the same type, have approximately the same colour, approximately the same size, and/or have approximately the same values for any other parameter.
  • a plausibility test can also comprise a combination of any of the above examples. When referring to the term approximately it should be understood that every sensor arrangement has a certain degree of uncertainty in determining data or parameters relating to the identified objects.
  • the determined velocity will have a certain degree of uncertainty.
  • the degree of uncertainty can differ between different sensor arrangements.
  • a first sensor arrangement could have determined the velocity of an object as 16 km/h ⁇ 4 km/h and a second sensor arrangement could have determined the velocity of an object as 22 km/h ⁇ 3 km/h. It can then be determined that the values for the velocities partly overlap when including the uncertainty and that it therefore is plausible that the identified objects represent the same physical object.
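For this velocity example, the plausibility check reduces to a simple interval-overlap test, sketched here:

```python
def plausible_same_velocity(v1, u1, v2, u2):
    """True if the intervals [v1-u1, v1+u1] and [v2-u2, v2+u2] overlap."""
    return abs(v1 - v2) <= u1 + u2

# The example from the text: 16 km/h +/- 4 km/h vs 22 km/h +/- 3 km/h.
# |16 - 22| = 6 <= 4 + 3 = 7, so [12, 20] and [19, 25] overlap: plausible.
assert plausible_same_velocity(16, 4, 22, 3)
```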
  • More complex algorithms known in the art can be used as well.
  • If the plausibility check indicates that the identification is plausible, the method continues with step 470 for these objects.
  • Otherwise, the method continues with step 450 for these objects.
  • In step 480, a track management procedure is performed.
  • This track management procedure can include storing information regarding which of the identified objects from the different sets of identified objects have been determined to represent the same physical object.
  • This track management procedure can include storing the values of the NKA for the identified objects which have been determined to represent the same physical object.
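Tying this to the earlier sketches, such bookkeeping could be as simple as the following; the argument names are assumptions:

```python
def update_track_management(fused_pairs, known_pairs, pair_run_counts):
    """After a run, remember which ID-tag pairs were fused as the same
    physical object so the next run can combine them first."""
    for tag_a, tag_b in fused_pairs:
        known_pairs.add((tag_a, tag_b))
        pair_run_counts[(tag_a, tag_b)] += 1
```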
  • the method 400 ends after step 480.
  • the method 400 starts again.
  • the method 400 is performed repeatedly during operation of the vehicle.
  • FIG. 5 is a diagram of one version of a device 500.
  • the control units 200 and 205 described with reference to Figure 2 may in one version comprise the device 500.
  • the device 500 comprises a non-volatile memory 520, a data processing unit 510 and a read/write memory 550.
  • the non-volatile memory 520 has a first memory element 530 in which a computer program, e.g. an operating system, is stored for controlling the function of the device 500.
  • the device 500 further comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date input and transfer unit, an event counter and an interruption controller (not depicted).
  • the non-volatile memory 520 has also a second memory element 540.
  • the computer program comprises routines for controlling a gas engine, wherein the gas engine is supplied with a fuel gas which consists of different kinds of molecules and which is stored in at least a gaseous phase and a liquid phase in a gas storage device.
  • the computer program P may comprise routines for decentralised sensor fusion in a vehicle. This may at least partly be performed by means of said first control unit 200 and said plurality of sensor arrangements 220a, 220b, 220c, ...
  • the computer program P may comprise routines for sensing and identifying objects with a plurality of sensor arrangements.
  • the computer program P may comprise routines for associating data to said identified objects in the sets of identified objects. Said associated data comprises value(s) of at least one non-kinematic attribute. This might at least partly be performed by said plurality of sensor arrangements 220a, 220b, 220c, ... Said associated data, especially said value(s) for the NKA, can be stored in said non-volatile memory 520.
  • the computer program P may comprise routines for combining the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. This may at least partly be performed by means of said first control unit 200.
  • the computer program P may comprise routines for first combining identified objects out of different sets of identified objects whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining other identified objects from said respective sets. This may at least partly be performed by means of said first control unit 200, for example based on accessing stored values for the NKA from said non-volatile memory 520.
  • the computer program P may comprise routines for determining whether objects have been identified as the same object for a pre-determined number of previous runs of the method. This may at least partly be performed by means of said first control unit.
  • the computer program P may comprise routines for determining whether an identification of objects as the same physical object is plausible. This may at least partly be performed by means of said first control unit 200.
  • the program P may be stored in an executable form or in compressed form in a memory 560 and/or in a read/write memory 550.
  • When the data processing unit 510 performs a certain function, it means that it conducts a certain part of the program which is stored in the memory 560 or a certain part of the program which is stored in the read/write memory 550.
  • the data processing device 510 can communicate with a data port 599 via a data bus 515.
  • the non-volatile memory 520 is intended for communication with the data processing unit 510 via a data bus 512.
  • the separate memory 560 is intended to communicate with the data processing unit via a data bus 511.
  • the read/write memory 550 is arranged to communicate with the data processing unit 510 via a data bus 514.
  • the links L205, L220, L240, L250, and L270, for example, may be connected to the data port 599 (see Figure 2).
  • When data are received on the data port 599, they can be stored temporarily in the second memory element 540. When the input data received have been temporarily stored, the data processing unit 510 can be prepared to conduct code execution as described above.
  • Parts of the methods herein described may be conducted by the device 500 by means of the data processing unit 510 which runs the program stored in the memory 560 or the read/write memory 550. When the device 500 runs the program, methods herein described are executed.
  • The system according to the present disclosure can be arranged to perform any of the steps or actions described in relation to the method 400. It should also be understood that the method according to the present disclosure can further comprise any of the actions attributed to an element of the sensor fusion system 299 described in relation to Fig. 2. The same applies to the computer program and the computer program product.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a method (400) and a system (299) for decentralised sensor fusion providing object association with optimised computation load. The method and system comprise a plurality of sensor arrangements (220a, 220b, 220c, ...). Each sensor arrangement is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data comprising at least one non-kinematic attribute to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement (200, 205). The fusion centre arrangement is arranged to combine (440-470) the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. The combination of data is obtained by first combining identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets.

Description

Method for decentralised sensor fusion in a vehicle and sensor fusion system
TECHNICAL FIELD
The present disclosure relates to a method for decentralised sensor fusion in a vehicle and a sensor fusion system. It further relates to a vehicle, a computer program and a computer program product.
BACKGROUND ART
Sensor fusion is used to improve the information about objects which are sensed by more than one sensor. For fusing information from two sources, such as two sensors, it is important to identify which part of the information from the two sources relates to the same object(s) and which part does not. The information which relates to the same object is then fused. There are two main principles of sensor fusion systems: measurement-to-track association and track-to-track association. In measurement-to-track association, also called centralised sensor fusion or low-level fusion, basically all measurement results from the sensors are transmitted to a fusion centre. Such a transmission of basically all measurement results also includes noise. The fusion centre is then arranged to analyse the measured data and to determine what information could be extracted from the measured data and how this information relates, or partly relates, to the same object(s) or to different objects.
In track-to-track association, also called decentralised sensor fusion or high-level fusion, the sensors perform some analysis of the measured or sensed data. As an example, objects can be identified by the sensors. These sensors are sometimes called smart sensors. Only the analysed data is then transmitted to the fusion centre. The fusion centre is then arranged to determine which part of the analysed data from different sensors relates to the same object(s) and which part relates to different objects. Both track-to-track association and measurement-to-track association are computationally intensive.
SUMMARY OF THE INVENTION
It is thus an objective of the present disclosure to provide a less computationally intensive decentralised sensor fusion. Another objective of the present disclosure is to provide an alternative way of performing decentralised sensor fusion.
At least some of the objectives are achieved by a method for decentralised sensor fusion in a vehicle. The method comprises sensing and identifying objects with a plurality of sensor arrangements. Said sensing and identifying is performed independently by each sensor arrangement out of said plurality of sensor arrangements so that each sensor arrangement provides a respective set of identified objects. The method further comprises for each sensor arrangement out of said plurality of sensor arrangements: associating, by said sensor arrangement, data to said identified objects in said respective set of identified objects. The method even further comprises combining the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. Said associated data comprises value(s) of at least one non-kinematic attribute which is provided by the corresponding sensor arrangement. Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method. Combining said identified objects from all said respective sets of identified objects comprises first combining identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets. This has the advantage that the steps of gating and associating objects to each other can be skipped for certain objects in a decentralised sensor fusion method. Since gating and association are computationally intensive, this means that sensor fusion can be performed faster, and/or on less complex hardware, and/or with more sensors included. As a consequence, either the hardware cost can be lowered and/or the quality of the information available from a plurality of sensors can be increased. This might be especially useful in increasing the performance of autonomous or assisted systems for vehicle driving.
In one example of the method, said non-kinematic attribute is an ID-tag. This provides an easy way of associating a non-kinematic attribute to an object. In one example, the method is repeatedly performed during operation of the vehicle. This allows for constantly providing input data to autonomous or assisted systems for vehicle driving.
In one example of the method, said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects which have been identified as the same object for a pre-determined number of previous runs of the method. This allows increasing the quality of the method by sorting out accidental identification of objects in different sets as the same physical object.
In one example, said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects for which a plausibility check indicates that identifying these objects as the same object again is plausible. This allows increasing the quality of the method by sorting out unjustified identifications of objects in different sets as the same physical object.

At least some of the objectives are achieved by a sensor fusion system. The sensor fusion system comprises a plurality of sensor arrangements. Each sensor arrangement out of the plurality of sensor arrangements is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement. The sensor fusion system further comprises a fusion centre arrangement. The fusion centre arrangement is arranged to combine the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. Said associated data comprises value(s) of at least one non-kinematic attribute, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects and associating data to said identified objects in said respective set of identified objects by each sensor arrangement. The fusion centre arrangement is arranged to, when combining said identified objects from all said respective sets of identified objects, first combine identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets.

In one embodiment of the system, said non-kinematic attribute is an ID-tag.
In one embodiment, said fusion centre arrangement is arranged to first combine the identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects which have been identified as the same object for a pre-determined number of times. In one embodiment, the fusion centre arrangement is further arranged to perform plausibility checks and to perform said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects for which the plausibility check indicates that identifying these objects as the same object again is plausible.

At least some of the objectives are achieved by a vehicle comprising a sensor fusion system according to the present disclosure.
At least some of the objectives are achieved by a computer program for decentralised sensor fusion in a vehicle. Said computer program comprises program code for causing an electronic control unit or a computer connected to the electronic control unit to perform the steps according to the method of the present disclosure.
At least some of the objectives are also achieved by a computer program product containing a program code stored on a computer-readable medium for performing the method according to the present disclosure, when said computer program is run on an electronic control unit or a computer connected to the electronic control unit.

The system, the vehicle, the computer program and the computer program product have advantages corresponding to those described in connection with the corresponding examples of the method according to this disclosure.
Further advantages of the present invention are described in the following detailed description and/or will become apparent to a person skilled in the art when practising the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
For a more detailed understanding of the present invention and its objects and advantages, reference is made to the following detailed description which should be read together with the accompanying drawings. Same reference numbers refer to same components in the different figures. In the following,
Fig. 1 shows, in a schematic way, a vehicle according to one embodiment of the present invention;
Fig. 2 shows, in a schematic way, a sensor fusion system according to one embodiment of the present invention;
Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art;
Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present disclosure;
Fig. 4 shows, in a schematic way, a flow chart over an example of a method according to the present invention; and
Fig. 5 shows, in a schematic way, a device which can be used in connection with the present invention.
DETAILED DESCRIPTION
Fig. 1 shows a side view of a vehicle 100. In the shown example, the vehicle comprises a tractor unit 110 and a trailer unit 112. The vehicle 100 can be a heavy vehicle such as a truck. In one example, no trailer unit is connected to the vehicle 100. The vehicle 100 comprises a sensor fusion system 299, see Fig. 2. The sensor fusion system 299 can be arranged in the tractor unit 110.
In one example, the vehicle 100 is a bus. The vehicle 100 can be any kind of vehicle. Other examples of vehicles are boats, passenger cars, construction vehicles, and locomotives.
In the following, the term "link" refers to a communication link which may be a physical connection such as an opto-electronic communication line, or a non-physical connection such as a wireless connection, e.g. a radio link or microwave link.
Fig. 2 shows schematically an embodiment of a sensor fusion system 299. Said sensor fusion system 299 comprises a plurality of sensor arrangements 220a, 220b, 220c, ... Said plurality of sensor arrangements 220a, 220b, 220c, ... comprises at least a first sensor arrangement 220a and a second sensor arrangement 220b. In the shown example said plurality of sensor arrangements 220a, 220b, 220c, ... comprises a third sensor arrangement 220c. Said first sensor arrangement 220a is arranged to sense and identify objects. Said objects are preferably objects outside the vehicle in which the sensor fusion system 299 is provided. Examples of such objects are other vehicles, pedestrians, bicycles, infrastructure such as traffic lights, road signs or crash barriers, stripes on the road, pavements, vegetation, animals, or the like. Said first sensor arrangement 220a can comprise a camera. The camera is in one example arranged to detect visible light. Said visible light can have been emitted or reflected by objects. The camera is in one example arranged to detect infrared light. The camera can be arranged to detect any kind of electromagnetic radiation. The camera is arranged to provide images of the surrounding of the vehicle. Said first sensor arrangement can comprise an analysing unit for the camera. Said analysing unit for the camera is arranged to identify objects in the images which are provided by the camera.
Said first sensor arrangement 220a is arranged to provide a first set of identified objects. In this example the set of identified objects comprises the objects which have been sensed by the camera and identified by the analysing unit for the camera. Here, and in the following, it should be understood that a set of identified objects by a sensor arrangement can comprise any number of identified objects. In one example, no object is identified. In one example one object is identified. In one example more than one object is identified. Said first sensor arrangement 220a is arranged to associate data to said identified objects in said first set of identified objects. Said data can comprise the position of said identified objects. The position of an identified object is in one example provided in one dimension, for example as a distance to the sensor. The position of an identified object is in one example provided in two dimensions, for example as a distance to the sensor and an angle in relation to the sensor. The position of an identified object is in one example provided in three dimensions, for example as a distance to the sensor and two angles in relation to the sensor. The position can also be provided in a one-, two-, or three-dimensional coordinate system, such as a Cartesian coordinate system.
Said associated data can comprise a velocity of the object. Said velocity can be one-, two-, or three-dimensional.
Said associated data comprises value(s) of at least one non-kinematic attribute, NKA. Examples of non-kinematic attributes are an ID-tag, a kind of object, a colour of the object, a size of the object, or the like. As an example, the first sensor arrangement 220a can be arranged to associate a number as a value for an ID-tag as an NKA to each identified object. In one example the first sensor arrangement 220a is arranged to associate a colour value as an NKA to each identified object. In one example, the first sensor arrangement 220a is arranged to associate a value for the kind of object as an NKA to each identified object. Examples of values for the kind of object are bus, passenger car, truck, bicycle, human being, infrastructure, or the like.
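Purely as an illustration of the associated data just described, and not as part of the disclosure, an identified object could be represented as sketched below; all names and the two-dimensional position format are assumptions made for the example:

```python
# A minimal sketch, assuming illustrative names, of an identified object with
# associated data: a two-dimensional position (distance and angle in relation
# to the sensor), a velocity, and values of non-kinematic attributes (NKA).
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class IdentifiedObject:
    distance_m: float            # distance to the sensor
    angle_rad: float             # angle in relation to the sensor
    speed_mps: float             # one-dimensional velocity
    nka: Dict[str, str] = field(default_factory=dict)  # NKA name -> value

# Example: a first sensor arrangement tags an identified object.
obj = IdentifiedObject(distance_m=12.4, angle_rad=0.08, speed_mps=1.4,
                       nka={"id_tag": "A1", "kind": "pedestrian"})
```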
Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object during consecutive providing of a first set of identified objects and associating data to said identified objects in said first set of identified objects by said first sensor arrangement 220a. As an example, the type of object and/or the colour of the object are intended to remain the same. This is due to the fact that objects typically do not change colour or type. However, this might of course happen accidentally, for example when a camera is first exposed to direct sunlight and then enters a shadowed area, so that the appearance of the colour of the object in the camera can change. The ID-tag of the object and the size of the object are also examples of values which are intended to remain the same.
Said first sensor arrangement 220a is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ... Said second sensor arrangement 220b is arranged to perform basically the same tasks as the first sensor arrangement 220a. As an example, said second sensor arrangement 220b is arranged to sense and identify objects.
Said second sensor arrangement 220b can comprise a radar arrangement. The radar arrangement is in one example arranged to detect reflected radar beams from objects. The radar arrangement is arranged to provide radar images of the surrounding of the vehicle. Said second sensor arrangement 220b can comprise an analysing unit for the radar arrangement. Said analysing unit for the radar arrangement is arranged to identify objects in the radar images which are provided by the radar arrangement.
Said second sensor arrangement 220b is arranged to provide a second set of identified objects. In this example the second set of identified objects comprises the objects which have been sensed by the radar arrangement and identified by the analysing unit for the radar arrangement.
Said second sensor arrangement 220b is arranged to associate data to said identified objects in said second set of identified objects. The second sensor arrangement 220b is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.
It should be understood that the first set of identified objects and the second set of identified objects can contain the same objects, different objects, or partly the same and partly different objects. That the objects, at least partly, can be different objects is in one example due to the fact that the first and the second sensor arrangements 220a and 220b are arranged to sense different parts of the surrounding of the vehicle. Thus, an object visible in the field of view or the sensing range of the first sensor arrangement might not necessarily be visible in the field of view or the sensing range of the second sensor arrangement. That the objects, at least partly, can be different objects is in one example due to the fact that an object which can be sensed by a camera arrangement is not necessarily detectable by the radar arrangement, and vice versa. Said second sensor arrangement 220b is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ...
Said third sensor arrangement 220c is arranged to perform basically the same tasks as the first sensor arrangement 220a and the second sensor arrangement 220b. As an example, said third sensor arrangement 220c is arranged to sense and identify objects.
Said third sensor arrangement 220c can comprise a lidar arrangement. The lidar arrangement is in one example arranged to detect reflected laser beams from objects. The lidar arrangement is arranged to provide images of the surrounding of the vehicle based on detection of the reflected laser beams. Said third sensor arrangement 220c can comprise an analysing unit for the lidar arrangement. Said analysing unit for the lidar arrangement is arranged to identify objects in the images which are provided by the lidar arrangement.
Said third sensor arrangement 220c is arranged to provide a third set of identified objects. In this example the third set of identified objects comprises the objects which have been sensed by the lidar arrangement and identified by the analysing unit for the lidar arrangement. Said third sensor arrangement 220c is arranged to associate data to said identified objects in said third set of identified objects. The third sensor arrangement 220c is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.
What has been said regarding the same or different kind of objects between the first and second set of identified objects applies in a corresponding way also to the third set of identified objects in relation to the first and/or second set of identified objects.
Said third sensor arrangement 220c is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ...
The sensor fusion system 299 comprises a fusion centre arrangement. Said fusion centre arrangement can be a first control unit 200.
The first sensor arrangement 220a is arranged to transmit said first set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said first sensor arrangement 220a. Said first control unit 200 is arranged for communication with said first sensor arrangement 220a via a link L220a. Said first control unit 200 is arranged to receive information from said first sensor arrangement 220a.
The second sensor arrangement 220b is arranged to transmit said second set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said second sensor arrangement 220b. Said first control unit 200 is arranged for communication with said second sensor arrangement 220b via a link L220b. Said first control unit 200 is arranged to receive information from said second sensor arrangement 220b.
The third sensor arrangement 220c is arranged to transmit said third set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said third sensor arrangement 220c. Said first control unit 200 is arranged for communication with said third sensor arrangement 220c via a link L220c. Said first control unit 200 is arranged to receive information from said third sensor arrangement 220c.
In one example, the transmission is performed over a CAN-bus of the vehicle. The first control unit 200 is arranged to combine the identified objects from the first, second, and third set of identified objects based on the associated data. The first control unit 200 is arranged to perform the combining so that same objects between the different sets of identified objects are identified. The first control unit 200 is arranged to perform the combining so that a common set of objects is provided. The plurality of sensor arrangements 220a, 220b, 220c, ... and the first control unit 200 are preferably arranged to repeatedly perform the actions that they are arranged for. As an example, the sensing and identifying, the transmitting and the combining can be performed approximately every 17 milliseconds. Any other time period for the repetition is possible as well.
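As a non-authoritative sketch of this repeated cycle, and assuming hypothetical sense_and_identify, combine, and publish methods that are not named in the disclosure:

```python
# A sketch of one fusion cycle repeated approximately every 17 milliseconds.
import time

CYCLE_PERIOD_S = 0.017  # one example; any other time period is possible

def fusion_loop(sensor_arrangements, fusion_centre):
    while True:
        start = time.monotonic()
        # Each sensor arrangement senses and identifies objects independently
        # and provides its respective set, e.g. transmitted over a CAN-bus.
        sets_of_objects = [s.sense_and_identify() for s in sensor_arrangements]
        # The fusion centre combines the sets into one common set of objects.
        common_set = fusion_centre.combine(sets_of_objects)
        fusion_centre.publish(common_set)  # e.g. to an adaptive cruise control
        # Sleep for the remainder of the cycle period, if any.
        time.sleep(max(0.0, CYCLE_PERIOD_S - (time.monotonic() - start)))
```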
The first control unit 200 is arranged to, when combining said identified objects from the first, second, and third set of identified objects, first combine identified objects out of the first, second, and third set whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said first, second and third set. This is explained in more detail in relation to Fig. 3b.

So far only three sensor arrangements have been described. In general, however, more than the described three sensor arrangements can be part of the plurality of sensor arrangements 220a, 220b, 220c, ... Any of the described first, second, or third sensor arrangements could be provided several times. As an example, the sensor fusion system 299 could be provided with a left side radar and a right side radar. Both the left and the right side radar could be arranged to operate as the described second sensor arrangement. Several cameras, several lidars, or the like can also be provided. Other types of sensor arrangements can also be provided as part of the plurality of sensor arrangements. Examples of such sensor arrangements are sensors arranged to receive information from other vehicles or from infrastructure. In one example, a sensor for receiving vehicle-to-vehicle communication is provided. Other vehicles can send information regarding their position, for example obtained via a GPS-receiver, and/or their type of vehicle to the sensor. Another example is a sensor for receiving infrastructure-to-vehicle communication. As an example, the sensor could be arranged to receive information regarding road signs, such as speed limits, the status of a traffic light, or the like.

The first control unit 200 is in one example arranged to transmit information regarding the common set of objects to a device array 270. Said first control unit 200 can be arranged to control operation of said device array 270. Said first control unit 200 is arranged for communication with said device array 270 via a link L270. Said first control unit 200 can be arranged to receive information from said device array 270. The device array 270 is arranged to receive the transmitted information from the first control unit 200. The device array 270 can comprise one or more devices 271-276, ... In one example, the device array 270 comprises an adaptive cruise control device 271. In one example, the device array 270 comprises a warning system for pedestrians and/or bicyclists 272. In one example, the device array 270 comprises a blind spot warning device 273. In one example, the device array 270 comprises an automatic or assisted queue driving system 274. In one example, the device array 270 comprises an automatic or assisted overtaking system 275. In one example, the device array 270 comprises an automatic or assisted reversing system 276. Said received information can then be provided as input information to any of the device(s) in the device array 270. In principle, any device or system which needs information regarding objects surrounding the vehicle can be part of the device array 270. In one example, a screen is part of the device array 270.

A second control unit 205 is arranged for communication with the first control unit 200 via a link L205 and may be detachably connected to it. It may be a control unit external to the vehicle 100. It may be adapted to conduct the innovative method steps according to the invention.
The second control unit 205 may be used to cross-load software to the first control unit 200, particularly software for conducting the innovative method. It may alternatively be arranged for communication with the first control unit 200 via an internal network on board the vehicle. It may be adapted to performing substantially the same functions as the first control unit 200. The innovative method may be conducted by the first control unit 200 or the second control unit 205, or by both of them.
The sensor fusion system 299 can perform any of the method steps described later in relation to Fig. 4.
Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art. When referring to matching of objects in the description of Fig. 3a and 3b, this relates to the fusion of the sensor data relating to the objects. Two sets of identified objects are provided. A first set of identified objects comprises identified objects 300-305. The first set is indicated by circles filled with black colour. A second set of identified objects comprises identified objects 310-315. The second set is indicated by circles filled with white colour. One aim of sensor fusion is to combine the two sets of identified objects so that a common set of identified objects is provided. In particular, identified objects in the first and the second set which represent the same physical object have to be found.
According to a prior art solution, an attempt is made to match a first identified object 300 of the first set to all identified objects 310-315 of the second set. This is indicated by continuous lines. An attempt is then made to match a second identified object 301 of the first set to all identified objects 310-315 of the second set. This is indicated by dashed lines. This procedure continues until all objects in the first set and the second set have been checked against each other to determine whether they refer to the same physical object. In the shown example, an attempt is made to match a last identified object 305 of the first set to all identified objects 310-315 of the second set. This is indicated by dotted lines. Attempts are also made to match the other identified objects of the first set to the identified objects of the second set. However, the lines indicating these matchings have been omitted so as not to overload the figure. The matching could, for example, include calculating the probability that an object from the first set refers to the same physical object as an object from the second set. The two objects with the highest probability of matching are then matched together. Thresholds or the like can be used to avoid matching objects which in reality do not correspond to the same physical object and which have not been identified in both sets.
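For illustration, the prior-art style procedure could be sketched as follows; the probability model and the x/y position attributes are assumptions made here for the example, not part of the prior art description:

```python
# A sketch of exhaustive prior-art association: every object of the first set
# is tried against every object of the second set, the most probable pairs
# are matched first, and a threshold avoids matching objects which do not
# correspond to the same physical object.
import math

def match_probability(obj_a, obj_b):
    # Illustrative model: probability decays with the distance between the
    # estimated positions.
    return math.exp(-math.hypot(obj_a.x - obj_b.x, obj_a.y - obj_b.y))

def associate_exhaustively(first_set, second_set, threshold=0.5):
    candidates = sorted(
        ((match_probability(a, b), i, j)
         for i, a in enumerate(first_set)
         for j, b in enumerate(second_set)),
        reverse=True)
    matches, used_a, used_b = [], set(), set()
    for p, i, j in candidates:
        if p < threshold:
            break  # all remaining pairs are even less probable
        if i not in used_a and j not in used_b:
            matches.append((i, j))
            used_a.add(i)
            used_b.add(j)
    return matches  # index pairs believed to refer to the same physical object
```

The nested probability calculation over all pairs is what makes this approach computationally intensive.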
Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present invention. Two sets of identified objects are provided. A first set of identified objects comprises identified objects 300-305. The first set is indicated by circles filled with black colour. A second set of identified objects comprises identified objects 310-315. The second set is indicated by circles filled with white colour. The identified objects can be the same as in the prior art solution. However, all identified objects will have a respective NKA associated to them, for example an ID-tag. As an example, the identified objects 300-305 of the first set could have A1, A5, A13, A3, A127, and A21 as values for the respective ID-tag. As an example, the identified objects 310-315 of the second set could have B3, B69, B338, B15, B7, and B28 as values for the respective ID-tag. The values for the NKA are provided independently by the two sensor arrangements, i.e. the first sensor arrangement providing the first set of identified objects does in general not "know" what values for the NKA are provided by the second sensor arrangement, and vice versa.
The sensor providing the first set and the sensor providing the second set are so-called smart sensors or so-called tracking sensors. The sensor arrangement providing the first set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed. The sensor arrangement providing the second set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed. As an example, in case a specific pedestrian is identified as the first identified object 300 in the first set and the value A1 is associated to that object, the first sensor arrangement is arranged to associate the value A1 again to the object representing the pedestrian the next time the sensor arrangement senses and identifies objects, i.e. at the next run of a method according to the present disclosure. At the next run, however, the specific pedestrian does not need to be the first identified object but can, for example, be the third identified object. In this case the value A1 of the ID-tag will be associated to the third identified object at the next run.
The same applies to the second sensor arrangement. As an example, the same specific pedestrian as discussed in connection with the first sensor arrangement is identified as the third object and receives, for example, B338 as value for the ID-tag.
When trying to match the objects from the two sets, the fusion centre arrangement will first analyse whether an object with a first value of the NKA of the first set has been matched to an object with a second value of the NKA of the second set before, i.e. at a previous run of the method. As an example, the fusion centre arrangement will receive A1 as the value for the ID-tag for the first identified object of the first set. The fusion centre arrangement will then analyse whether A1 was matched to a value for the ID-tag in the second set before. The fusion centre arrangement might find that A1 has been matched to B338 before and will then find that B338 belongs to the third identified object of the second set. The fusion centre will then match these two objects before matching any objects which have no previously used "partner" in the second set.
As a result, instead of calculating probabilities against all identified objects in the second set and then deciding which objects to match, which is a computationally intensive task, the only thing which has to be done is to compare values of ID-tags, which is a computationally easy task and can be performed relatively fast. In Fig. 3b it is thus indicated by the only continuous line that the first object of the first set can be matched directly to the third object of the second set.

As an example, the second object of the first set, having the value A5 as ID-tag, might previously have been matched to an object of the second set having B69 as value of the ID-tag. In that case, the fusion centre arrangement identifies that the second object in the second set now has B69 as value for the ID-tag and will then match the second object from the first set with the second object from the second set. This is indicated by the only dashed line. After all the objects of the first set which have previously been matched with an object of the second set have been matched again, because an object with the corresponding value of the NKA is still present in the second set, an attempt is made to match the remaining objects. This is indicated for the last object of the first set by the dotted line.

Fig. 4 shows, in a schematic way, a flow chart over an example of a method 400 for decentralised sensor fusion in a vehicle according to the present invention. Not all steps of the described method are necessary. Instead, the method is presented in a way in which it might be implemented in a vehicle. However, several of the steps are optional and can thus be omitted in other examples of the method. Which steps are optional will in general depend on the specific implementation in a specific vehicle.
The method starts with a step 410 of sensing and identifying objects. Said sensing can be performed by camera arrangements, radar arrangements, lidar arrangements, or the like as described in relation to Fig. 2. Said sensing can also be performed by receiving information which is sent out from other vehicles or infrastructure as described in relation to Fig. 2. Step 410 is performed independently by a plurality of sensor arrangements. Each sensor arrangement is arranged to identify objects based on the data it receives through the sensing. The sensing and identifying of the objects is performed by each sensor arrangement in such a way that a set of identified objects is provided. Step 410 can be performed repeatedly. The method continues with step 420.
In step 420 data is associated to the identified objects in the set of objects. This is performed independently by each sensor arrangement for its respective set of identified objects. Said associated data can comprise position and velocity of the objects as described before. Said associated data comprises value(s) of at least one non-kinematic attribute, NKA. Said value(s) of the at least one NKA is/are independently provided by each sensor arrangement. Said value(s) of said at least one NKA is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method 400.

As an example, in case a first car is identified by a first sensor arrangement, it is intended that the same value of the NKA is always associated to said first car. This can be achieved in a number of different ways. In case the first car is adapted to send out a unique ID-number, the sensor arrangement can be arranged to analyse the ID-numbers of other cars and always associate the same value of the NKA to cars sending out the same ID-number. In another example, a sensor arrangement comprises a camera which provides images. The provided images can be analysed by the sensor arrangement and values for the NKA can be associated to each identified object in the image. The sensor arrangement typically is arranged to repeat step 410 in the order of some tens of milliseconds. Typical objects surrounding a vehicle will not move very far during such a time period. The sensor arrangement can then be arranged to associate the same value for the NKA to each identified object which is basically at the same place in the provided image on a next run of the method 400, as in the sketch below. More examples are known in the art of how sensor arrangements can keep track of identified objects. Basically any such sensor arrangement can be used to associate the same value of the NKA to the same identified object in a next run of step 420. Step 420 can be performed repeatedly. The method continues with step 430.
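One conceivable sketch of such position-based tracking, under the stated assumption that objects move little between runs; this is only one of the ways known in the art, and all names are illustrative:

```python
# A detection inherits the ID-tag of the closest detection from the previous
# run; a detection with no close predecessor gets a fresh tag.
import math

def assign_stable_id_tags(new_positions, previous_tags, next_tag_no, max_jump_m=1.0):
    """new_positions: list of (x, y); previous_tags: dict tag -> (x, y)."""
    tags = {}
    for pos in new_positions:
        best_tag, best_d = None, max_jump_m
        for tag, prev in previous_tags.items():
            d = math.hypot(pos[0] - prev[0], pos[1] - prev[1])
            if d < best_d and tag not in tags:
                best_tag, best_d = tag, d
        if best_tag is None:  # a newly appeared object gets a fresh tag
            best_tag, next_tag_no = f"A{next_tag_no}", next_tag_no + 1
        tags[best_tag] = pos
    return tags, next_tag_no  # pass 'tags' as 'previous_tags' at the next run
```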
In step 430 the set of identified objects and the associated data is transmitted to a fusion centre arrangement. Said fusion centre arrangement can be an electronic control unit as described in relation to Fig. 2. Said fusion centre arrangement can be an existing electronic control unit in the vehicle. Said electronic control unit can be situated at a distance from the sensor arrangement. In one example, said fusion centre arrangement is implemented at one sensor arrangement out of the plurality of sensor arrangements. The method continues with step 440. All of the following steps are preferably performed by the fusion centre arrangement.
In step 440 it is analysed which value(s) of said at least one non-kinematic attribute of the identified objects out of the different sets had been combined previously and had been identified as same objects. This can be done in the following way: For an object in a first set of identified objects the value(s) of the NKA is/are determined.
These value(s) of the NKA can be denoted first value(s) of the NKA. It is analysed whether that object of the first set has previously been identified as representing the same physical object as an identified object in any of the other sets of identified objects. This can be done by analysing, for example in a look-up table, whether said first value(s) of the NKA had previously been combined with second value(s) of the NKA attributed to identified objects in any other set of identified objects. It can then be analysed whether any of said second value(s) of the NKA are present in any other of the present sets of identified objects. If this is the case, it can be concluded that the object(s) in the other sets of identified objects having said second value(s) of the NKA still relate to the same physical object.

For the objects of the different sets of identified objects which have been identified as representing the same physical object, the method continues with step 470. In one example, step 490 is performed before continuing with step 470 for these objects. The above can be done for all identified objects of all sets of identified objects.

In one example, it is analysed whether objects of the different sets of identified objects have been identified as representing the same physical object for a pre-determined number of times. As an example, said pre-determined number of times can be two, three, four, five, or more times. Only if these objects have been identified as representing the same physical object for at least said pre-determined number of times does the method continue with step 470 for these objects. This assures that an accidental identification of objects as representing the same physical object during one or a few runs of the method will not lead to these objects being treated as the same object when the method 400 is performed continuously. In case identified objects have been identified as representing the same physical object for a pre-determined number of times, it can be concluded that the identification as the same physical object is likely not accidental.
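A minimal sketch of this confirmation logic, with names and data layout assumed for illustration:

```python
# A pairing of NKA values only qualifies for the first-combining short-cut
# after it has been identified as the same physical object a pre-determined
# number of times.
PREDETERMINED_RUNS = 3  # e.g. two, three, four, five, or more times

match_counts = {}  # (tag in first set, tag in other set) -> matched run count

def record_match(tag_a, tag_b):
    match_counts[(tag_a, tag_b)] = match_counts.get((tag_a, tag_b), 0) + 1

def shortcut_allowed(tag_a, tag_b):
    # Sorts out accidental identifications during one or a few runs.
    return match_counts.get((tag_a, tag_b), 0) >= PREDETERMINED_RUNS
```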
The method continues with step 450 for the objects which do not proceed to step 490 or step 470.
In step 450 the data relating to the remaining objects is gated. This is performed in accordance with any gating process known in the art in connection with sensor fusion. The method continues with step 460.
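By way of an assumed illustration only, since the disclosure leaves the concrete gating process open, a simple position gate could look as follows; the x and y attributes are assumptions for this sketch:

```python
# A generic distance gate: pairs whose positions lie too far apart are
# excluded before the association algorithm of step 460 runs, so that the
# computationally intensive association considers fewer candidate pairs.
import math

def gate(first_set, second_set, gate_radius_m=5.0):
    # Keep only pairs close enough to plausibly be the same physical object.
    return [(a, b)
            for a in first_set
            for b in second_set
            if math.hypot(a.x - b.x, a.y - b.y) <= gate_radius_m]
```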
In step 460 an association algorithm is performed. This algorithm is intended to find identified objects representing the same physical object among the remaining objects in the sets of identified objects. Such association algorithms are well known in the art. The method continues with step 470. In one example, the order of the steps 460 and 470 is reversed.

In step 470 data relating to the same physical object is fused. The process of data fusion for data relating to the same physical object is known in the art. That data relates to the same physical object is known either from step 460, from step 440, or from step 490, respectively. The method continues with step 480.

In the optional step 490 it is analysed whether it is plausible that identified objects from at least two sets of identified objects really represent the same physical object. In other words, it is analysed whether the identification as the same physical object based on the value(s) of the NKA is reasonable. Such a plausibility test can be performed in a number of ways. In one example, it is analysed whether the identified objects have approximately the same velocity. In one example, it is analysed whether the identified objects have approximately the same position. In one example, it is analysed whether the identified objects are of the same type, have approximately the same colour, approximately the same size, and/or have approximately the same values for any other parameter. A plausibility test can also comprise a combination of any of the above examples.

When referring to the term approximately, it should be understood that every sensor arrangement has a certain degree of uncertainty in determining data or parameters relating to the identified objects. As an example, the determined velocity will have a certain degree of uncertainty. The degree of uncertainty can differ between different sensor arrangements. In one example, it is determined that it is plausible that identified objects represent the same physical object if the values of a parameter, including its uncertainty, from different sensor arrangements at least partly overlap. As an example, a first sensor arrangement could have determined the velocity of an object as 16 km/h ± 4 km/h and a second sensor arrangement could have determined the velocity of an object as 22 km/h ± 3 km/h. It can then be determined that the values for the velocities partly overlap when including the uncertainty and that it therefore is plausible that the identified objects represent the same physical object. More complex algorithms known in the art can be used as well.
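A minimal sketch of this interval-overlap plausibility test, using the worked velocity example above; the function name is an assumption:

```python
# Two measurements are considered plausibly the same object if their value
# ranges, including uncertainty, at least partly overlap.
def intervals_overlap(value_a, unc_a, value_b, unc_b):
    return (value_a - unc_a) <= (value_b + unc_b) and \
           (value_b - unc_b) <= (value_a + unc_a)

# 16 km/h +/- 4 km/h spans 12..20 km/h; 22 km/h +/- 3 km/h spans 19..25 km/h.
# The ranges overlap at 19..20 km/h, so the identification is plausible.
assert intervals_overlap(16.0, 4.0, 22.0, 3.0)
```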
In case it is determined that it is plausible that identified objects from at least two sets of identified objects really represent the same physical object the method continues with step 470 for these objects. In case it is determined that it is not plausible that identified objects from at least two sets of identified objects really represent the same physical object the method continues with step 450 for these objects.
In step 480 a track management procedure is performed. This track management procedure can include storing information regarding which of the identified objects from the different sets of identified objects have been determined to represent the same physical object. This track management procedure can include storing the values of the NKA for the identified objects which have been determined to represent the same physical object. In one example the method 400 ends after step 480. In a preferred example, the method 400 starts again. In a preferred example the method 400 is performed repeatedly during operation of the vehicle.
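A sketch of this stored pairing table, with the data layout assumed for illustration; it is what enables the first-combining short-cut of step 440 at the next run:

```python
# Track management: store which NKA values have been determined to represent
# the same physical object during this run.
def update_track_table(previous_pairs, confirmed_matches):
    """confirmed_matches: iterable of (tag in first set, tag in second set)
    pairs determined to represent the same physical object during this run."""
    table = dict(previous_pairs)
    for tag_a, tag_b in confirmed_matches:
        table[tag_a] = tag_b  # remembered for the short-cut at the next run
    return table
```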
Figure 5 is a diagram of one version of a device 500. The control units 200 and 205 described with reference to Figure 2 may in one version comprise the device 500. The device 500 comprises a non-volatile memory 520, a data processing unit 510 and a read/write memory 550. The non-volatile memory 520 has a first memory element 530 in which a computer program, e.g. an operating system, is stored for controlling the function of the device 500. The device 500 further comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date input and transfer unit, an event counter and an interruption controller (not depicted). The non-volatile memory 520 also has a second memory element 540.
The computer program P may comprise routines for decentralised sensor fusion in a vehicle. This may at least partly be performed by means of said first control unit 200 and said plurality of sensor arrangements 220a, 220b, 220c, ... The computer program P may comprise routines for sensing and identifying objects with a plurality of sensor arrangements. The computer program P may comprise routines for associating data to said identified objects in the sets of identified objects. Said associated data comprises value(s) of at least one non-kinematic attribute. This might at least partly be performed by said plurality of sensor arrangements 220a, 220b, 220c, ... Said associated data, especially said value(s) for the NKA, can be stored in said non-volatile memory 520.
The computer program P may comprise routines for combining the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. This may at least partly be performed by means of said first control unit 200. The computer program P may comprise routines for first combining identified objects out of different sets of identified objects whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining other identified objects from said respective sets. This may at least partly be performed by means of said first control unit 200, for example based on accessing stored values for the NKA from said non-volatile memory 520.
The computer program P may comprise routines for determining whether objects have been identified as the same object for a pre-determined number of previous runs of the method. This may at least partly be performed by means of said first control unit. The computer program P may comprise routines for determining whether an identification of objects as the same physical object is plausible. This may at least partly be performed by means of said first control unit 200.
The program P may be stored in an executable form or in compressed form in a memory 560 and/or in a read/write memory 550. Where it is stated that the data processing unit 510 performs a certain function, it means that it conducts a certain part of the program which is stored in the memory 560 or a certain part of the program which is stored in the read/write memory 550.
The data processing unit 510 can communicate with a data port 599 via a data bus 515. The non-volatile memory 520 is intended for communication with the data processing unit 510 via a data bus 512. The separate memory 560 is intended to communicate with the data processing unit via a data bus 511. The read/write memory 550 is arranged to communicate with the data processing unit 510 via a data bus 514. The links L205, L220, L240, L250, and L270, for example, may be connected to the data port 599 (see Figure 2).
When data are received on the data port 599, they can be stored temporarily in the second memory element 540. When input data received have been temporarily stored, the data processing unit 510 can be prepared to conduct code execution as described above.
Parts of the methods herein described may be conducted by the device 500 by means of the data processing unit 510 which runs the program stored in the memory 560 or the read/write memory 550. When the device 500 runs the program, methods herein described are executed.

The foregoing description of the preferred embodiments of the present invention is provided for illustrative and descriptive purposes. It is neither intended to be exhaustive, nor to limit the invention to the variants described. Many modifications and variations will obviously suggest themselves to one skilled in the art. The embodiments have been chosen and described in order to best explain the principles of the invention and their practical applications and thereby make it possible for one skilled in the art to understand the invention for different embodiments and with the various modifications appropriate to the intended use.

It should especially be noted that the system according to the present disclosure can be arranged to perform any of the steps or actions described in relation to the method 400. It should also be understood that the method according to the present disclosure can further comprise any of the actions attributed to an element of the sensor fusion system 299 described in relation to Fig. 2. The same applies to the computer program and the computer program product.

Claims

1. A method (400) for decentralised sensor fusion in a vehicle, the method comprising the steps of:
- sensing and identifying (410) objects with a plurality of sensor arrangements, wherein said sensing and identifying is performed independently by each sensor arrangement out of said plurality of sensor arrangements so that each sensor arrangement provides a respective set of identified objects;
- for each sensor arrangement out of said plurality of sensor arrangements: associating (420), by said sensor arrangement, data to said identified objects in said respective set of identified objects;
- combining (440-470) the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided;
characterised in that said associated data comprises value(s) of at least one non-kinematic attribute which is provided by the corresponding sensor arrangement, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method, and characterised in that the step of combining said identified objects from all said respective sets of identified objects comprises first combining (440; 470) identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining (450-470) other identified objects from said respective sets.
2. The method according to the previous claim, wherein said non-kinematic attribute is an ID-tag.
3. The method according to any one of the previous claims, wherein said method is repeatedly performed during operation of the vehicle.
4. The method according to any one of the previous claims, wherein said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects which have been identified as the same object for a pre-determined number of previous runs of the method.
5. The method according to any one of the previous claims, wherein said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects for which a plausibility check (490) indicates that identifying these objects as the same object again is plausible.
6. A sensor fusion system (299), said system comprising:
- a plurality of sensor arrangements (220a, 220b, 220c, ...), wherein each sensor arrangement out of the plurality of sensor arrangements (220a, 220b, 220c, ...) is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement (200; 205);
- the fusion centre arrangement (200; 205), being arranged to combine the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided; characterised in that said associated data comprises value(s) of at least one non-kinematic attribute, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects and associating data to said identified objects in said respective set of identified objects by each sensor arrangement, and characterised in that the fusion centre arrangement (200; 205) is arranged to, when combining said identified objects from all said respective sets of identified objects, first combine identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining other identified objects from said respective sets.
7. The system according to the previous claim, wherein said non-kinematic attribute is an ID-tag.
8. The system according to any one of claims 6-7, wherein said fusion centre arrangement (200; 205) is arranged to first combine the identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects which have been identified as the same object for a pre-determined number of times.
9. The system according to any one of claims 6-8, wherein the fusion centre arrangement (200; 205) is further arranged to perform plausibility checks and to perform said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects for which the plausibility check indicates that identifying these objects as the same object again is plausible.
10. A vehicle (100) comprising a sensor fusion system (299) according to any one of claims 6-9.
11. A computer program (P) for decentralised sensor fusion in a vehicle, wherein said computer program (P) comprises program code for causing an electronic control unit (200; 500) or a computer (205; 500) connected to the electronic control unit (200; 500) to perform the steps according to any of the claims 1-5.
12. A computer program product containing a program code stored on a computer- readable medium for performing method steps according to any of claims 1-5, when said computer program is run on an electronic control unit (200; 500) or a computer (205; 500) connected to the electronic control unit (200; 500).
PCT/SE2017/050495 2016-05-25 2017-05-15 Method for decentralised sensor fusion in a vehicle and sensor fusion system WO2017204719A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1650719A SE1650719A1 (en) 2016-05-25 2016-05-25 Method for decentralised sensor fusion in a vehicle and sensor fusion system
SE1650719-6 2016-05-25

Publications (1)

Publication Number Publication Date
WO2017204719A1 true WO2017204719A1 (en) 2017-11-30

Family

ID=60411762

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2017/050495 WO2017204719A1 (en) 2016-05-25 2017-05-15 Method for decentralised sensor fusion in a vehicle and sensor fusion system

Country Status (2)

Country Link
SE (1) SE1650719A1 (en)
WO (1) WO2017204719A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019102920A1 (en) * 2019-02-06 2020-08-06 Bayerische Motoren Werke Aktiengesellschaft Method and device for sensor data fusion for a vehicle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050021201A1 (en) * 2001-07-17 2005-01-27 Albrecht Klotz Method and device for data exchange and processing
US7283938B1 (en) * 2004-04-15 2007-10-16 Lockheed Martin Corporation Virtual sensor for data and sensor fusion
US20080235318A1 (en) * 2007-02-09 2008-09-25 Deepak Khosla Information Processing System for Classifying and/or Tracking an Object
US20130091503A1 (en) * 2011-10-05 2013-04-11 Amichai Painsky Data fusion in high computational load environments

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DURAISAMY B. ET AL.: "Combi-Tor: Track-to-Track Association Framework for Automotive Sensor Fusion", 2015 IEEE 18TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, 15 September 2015 (2015-09-15), XP032804222 *
HORRIDGE P. ET AL.: "XMAP: Track-to-Track Association with Metric, Feature, and Target-type Data", 2006 9TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION, 1 July 2006 (2006-07-01), XP031042393, ISBN: 978-1-4244-0953-2 *
HOUENOU A. ET AL.: "A Track-To-Track Association Method for Automotive Perception Systems", 2012 INTELLIGENT VEHICLES SYMPOSIUM, 3 June 2012 (2012-06-03), XP032453052, Retrieved from the Internet <URL:https://hal.archives-ouvertes.fr/hal-00740787/document> [retrieved on 20161203] *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210068110A (en) * 2018-09-30 2021-06-08 그레이트 월 모터 컴퍼니 리미티드 Target tracking method and device
EP3859595A4 (en) * 2018-09-30 2021-11-10 Great Wall Motor Company Limited Target tracking method and device
KR102473272B1 (en) * 2018-09-30 2022-12-02 그레이트 월 모터 컴퍼니 리미티드 Target tracking method and device
DE102019205504A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Control device and method as well as computer program product

Also Published As

Publication number Publication date
SE1650719A1 (en) 2017-11-26

Similar Documents

Publication Publication Date Title
CN106485949B (en) The sensor of video camera and V2V data for vehicle merges
KR101942109B1 (en) Method and system for validating information
US9165198B2 (en) Method for identifying a vehicle during vehicle-to-vehicle communication
US10410513B2 (en) Fusion of non-vehicle-to-vehicle communication equipped vehicles with unknown vulnerable road user
US9959765B2 (en) System and method for providing alert to a vehicle or an advanced driver assist system based on vehicle dynamics input
JP6714513B2 (en) An in-vehicle device that informs the navigation module of the vehicle of the presence of an object
WO2020057406A1 (en) Driving aid method and system
CN110874927A (en) Intelligent road side unit
CN113287073A (en) Automatic driving automobile simulator using network platform
WO2017204719A1 (en) Method for decentralised sensor fusion in a vehicle and sensor fusion system
US20230237783A1 (en) Sensor fusion
CN112649809A (en) System and method for fusing sensor data in a vehicle
JP2021513643A (en) Methods and devices for detecting critical lateral movements
CN111341148A (en) Control system and control method for a motor vehicle for processing multiple reflection signals
CN112009468A (en) Multi-hypothesis object tracking for autonomous driving systems
KR20240047408A (en) Detected object path prediction for vision-based systems
US20170092121A1 (en) Method and System for Determining and Using Property Associations
US20230311858A1 (en) Systems and methods for combining detected objects
JP7326429B2 (en) How to select the sensor image interval
Sathiyan et al. A comprehensive review on cruise control for intelligent vehicles
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
US20220258765A1 (en) Method for modeling the surroundings of an automated vehicle
US11994622B2 (en) Methods and systems for providing scan data for decision making by a self-driving vehicle
CN117075526B (en) Remote control method and device for automatic driving vehicle
WO2022091620A1 (en) Object tracking device and object tracking method

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17803163

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17803163

Country of ref document: EP

Kind code of ref document: A1