SE1650719A1 - Method for decentralised sensor fusion in a vehicle and sensor fusion system

Info

Publication number
SE1650719A1
Authority
SE
Sweden
Prior art keywords
objects
identified
sensor
same
identified objects
Prior art date
Application number
SE1650719A
Other languages
Swedish (sv)
Inventor
Larsson Christian
LUNDIN Hjalmar
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1650719A priority Critical patent/SE1650719A1/en
Priority to PCT/SE2017/050495 priority patent/WO2017204719A1/en
Publication of SE1650719A1 publication Critical patent/SE1650719A1/en

Classifications

    • G06F18/25 Fusion techniques
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/931 Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G06V10/40 Extraction of image or video features
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; coarse-fine approaches, e.g. multi-scale approaches; using context analysis; selection of dictionaries
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V30/224 Character recognition of printed characters having additional code marks or containing code marks
    • B60W2420/403 Image sensing, e.g. optical camera
    • G06T2207/20221 Image fusion; Image merging


Abstract

The present disclosure relates to a sensor fusion system comprising a plurality of sensor arrangements, each arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement. The associated data comprises value(s) of at least one non-kinematic attribute intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects. The fusion centre arrangement is arranged to combine the identified objects from all said respective sets of identified objects based on said associated data, so that same objects between the different respective sets of identified objects are identified and a common set of objects is provided, by first combining identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets. The present disclosure also relates to a method for decentralised sensor fusion in a vehicle, to a vehicle, to a computer program, and to a computer program product.

Description

Method for decentralised sensor fusion in a vehicle and sensor fusion system

TECHNICAL FIELD

The present disclosure relates to a method for decentralised sensor fusion in a vehicle and a sensor fusion system. It further relates to a vehicle, a computer program and a computer program product.
BACKGROUND ART

Sensor fusion is used to improve the information about objects which are sensed by more than one sensor. For fusing information from two sources, such as two sensors, it is important to identify which part of the information from the two sources relates to the same object(s) and which part does not. The information which relates to the same object is then fused. There are two main principles of sensor fusion systems: measurement-to-track association and track-to-track association.

In measurement-to-track association, also called centralised sensor fusion or low-level fusion, basically all measurement results from the sensors are transmitted to a fusion centre. In such a transmission of basically all measurement results, noise is also included. The fusion centre is then arranged to analyse the measured data and to determine what information can be extracted from the measured data and how this information relates, or partly relates, to the same object(s) or to different objects.

In track-to-track association, also called decentralised sensor fusion or high-level fusion, the sensors perform some analysis of the measured or sensed data. As an example, objects can be identified by the sensors. Such sensors are sometimes called smart sensors. Only the analysed data is then transmitted to the fusion centre. The fusion centre is then arranged to determine which part of the analysed data from different sensors relates to the same object(s) and which part relates to different objects.

Both track-to-track association and measurement-to-track association are computationally intensive.

SUMMARY OF THE INVENTION

It is thus an objective of the present disclosure to provide a less computationally intensive decentralised sensor fusion. Another objective of the present disclosure is to provide an alternative way of performing decentralised sensor fusion.
At least some of the objectives are achieved by a method for decentralised sensor fusion in a vehicle. The method comprises sensing and identifying objects with a plurality of sensor arrangements. Said sensing and identifying is performed independently by each sensor arrangement out of said plurality of sensor arrangements, so that each sensor arrangement provides a respective set of identified objects. The method further comprises, for each sensor arrangement out of said plurality of sensor arrangements: associating, by said sensor arrangement, data to said identified objects in said respective set of identified objects. The method even further comprises combining the identified objects from all said respective sets of identified objects based on said associated data, so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. Said associated data comprises value(s) of at least one non-kinematic attribute which is provided by the corresponding sensor arrangement. Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method. Combining said identified objects from all said respective sets of identified objects comprises first combining identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets.
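The ordering in the combining step — remembered pairings first, expensive association only for the remainder — can be illustrated with a short sketch. This is a minimal illustration under assumed data structures (dictionaries keyed by ID-tag values); the names `fuse_cycle`, `match_history` and `gate_and_associate` are invented for this example and are not taken from the patent.

```python
def gate_and_associate(remaining_sets):
    """Placeholder for a conventional gating/association stage; in a
    real system this is the computationally expensive part."""
    return []

def fuse_cycle(sets_by_sensor, match_history):
    """Combine per-sensor sets of identified objects into a common set.

    sets_by_sensor: {sensor_name: {id_tag: object_data}}
    match_history:  list of ((sensor_a, tag_a), (sensor_b, tag_b)) pairs
                    previously identified as the same physical object.
    """
    common = []
    remaining = {name: dict(objs) for name, objs in sets_by_sensor.items()}
    # First: re-apply previously confirmed pairings (a cheap tag lookup).
    for (sensor_a, tag_a), (sensor_b, tag_b) in match_history:
        if tag_a in remaining.get(sensor_a, {}) and tag_b in remaining.get(sensor_b, {}):
            common.append((remaining[sensor_a].pop(tag_a),
                           remaining[sensor_b].pop(tag_b)))
    # Then: only the leftover objects go through gating and association.
    common.extend(gate_and_associate(remaining))
    return common
```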
This has the advantage that the steps of gating and associating objects to each other can be skipped for certain objects in a decentralised sensor fusion method. Since gating and association are computationally intensive, sensor fusion can thus be performed faster, and/or on less complex hardware, and/or with more sensors included. As a consequence, either hardware cost can be lowered and/or the quality of the information available from a plurality of sensors can be increased. This might be especially useful for increasing the performance of autonomous or assisted systems for vehicle driving.

In one example of the method, said non-kinematic attribute is an ID-tag. This provides an easy way of associating a non-kinematic attribute to an object.

In one example, the method is repeatedly performed during operation of the vehicle. This allows for constantly providing input data to autonomous or assisted systems for vehicle driving.

In one example of the method, said first combining of identified objects out of different respective sets, whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, is only performed for objects which have been identified as the same object for a pre-determined number of previous runs of the method. This increases the quality of the method by sorting out accidental identifications of objects in different sets as the same physical object.

In one example, said first combining of identified objects out of different respective sets, whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, is only performed for objects for which a plausibility check indicates that identifying these objects as the same object again is plausible. This increases the quality of the method by sorting out non-justified identifications of objects in different sets as the same physical object.
At least some of the objectives are achieved by a sensor fusion system. The sensor fusion system comprises a plurality of sensor arrangements. Each sensor arrangement out of the plurality of sensor arrangements is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement. The sensor fusion system further comprises a fusion centre arrangement. The fusion centre arrangement is arranged to combine the identified objects from all said respective sets of identified objects based on said associated data, so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. Said associated data comprises value(s) of at least one non-kinematic attribute, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects and associating of data to said identified objects in said respective set of identified objects by each sensor arrangement. The fusion centre arrangement is arranged to, when combining said identified objects from all said respective sets of identified objects, first combine identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets.

In one embodiment of the system, said non-kinematic attribute is an ID-tag.

In one embodiment, said fusion centre arrangement is arranged to first combine the identified objects out of different respective sets, whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, only for objects which have been identified as the same object for a pre-determined number of times.

In one embodiment, the fusion centre arrangement is further arranged to perform plausibility checks and to perform said first combining of identified objects out of different respective sets, whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, only for objects for which the plausibility check indicates that identifying these objects as the same object again is plausible.
At least some of the objectives are achieved by a vehicle comprising a sensor fusion system according to the present disclosure.
At least some of the objectives are achieved by a computer program for decentralised sensor fusion in a vehicle. Said computer program comprises program code for causing an electronic control unit or a computer connected to the electronic control unit to perform the steps according to the method of the present disclosure.
At least some of the objectives are also achieved by a computer program product containing a program code stored on a computer-readable medium for performing the method according to the present disclosure, when said computer program is run on an electronic control unit or a computer connected to the electronic control unit.

The system, the vehicle, the computer program and the computer program product have corresponding advantages as have been described in connection with the corresponding examples of the method according to this disclosure.
Further advantages of the present invention are described in the following detailed description and/or will become apparent to a person skilled in the art when performing the invention.
BRIEF DESCRIPTION OF THE DRAWINGS

For a more detailed understanding of the present invention and its objects and advantages, reference is made to the following detailed description, which should be read together with the accompanying drawings. Same reference numbers refer to same components in the different figures. In the following:

Fig. 1 shows, in a schematic way, a vehicle according to one embodiment of the present invention;

Fig. 2 shows, in a schematic way, a sensor fusion system according to one embodiment of the present invention;

Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art;

Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present disclosure;

Fig. 4 shows, in a schematic way, a flow chart over an example of a method according to the present invention; and

Fig. 5 shows, in a schematic way, a device which can be used in connection with the present invention.
DETAILED DESCRIPTION

Fig. 1 shows a side view of a vehicle 100. In the shown example, the vehicle comprises a tractor unit 110 and a trailer unit 112. The vehicle 100 can be a heavy vehicle such as a truck. In one example, no trailer unit is connected to the vehicle 100. The vehicle 100 comprises a sensor fusion system 299, see Fig. 2. The sensor fusion system 299 can be arranged in the tractor unit 110.

In one example, the vehicle 100 is a bus. The vehicle 100 can be any kind of vehicle. Other examples of vehicles are boats, passenger cars, construction vehicles, and locomotives.

In the following, the term "link" refers to a communication link which may be a physical connection such as an opto-electronic communication line, or a non-physical connection such as a wireless connection, e.g. a radio link or microwave link.
Fig. 2 shows schematically an embodiment of a sensor fusion system 299. Said sensor fusion system 299 comprises a plurality of sensor arrangements 220a, 220b, 220c, .... Said plurality of sensor arrangements 220a, 220b, 220c, ... comprises at least a first sensor arrangement 220a and a second sensor arrangement 220b. In the shown example, said plurality of sensor arrangements 220a, 220b, 220c, ... comprises a third sensor arrangement 220c.
Said first sensor arrangement 220a is arranged to sense and identify objects. Said objects are preferably objects outside the vehicle in which the sensor fusion system 299 is provided. Examples of such objects are other vehicles, pedestrians, bicycles, infrastructure such as traffic lights, road signs or crash barriers, stripes on the road, pavements, vegetation, animals, or the like.
Said first sensor arrangement 220a can comprise a camera. The camera is in one example arranged to detect visible light. Said visible light can have been emitted or reflected by objects. The camera is in one example arranged to detect infrared light. The camera can be arranged to detect any kind of electromagnetic radiation. The camera is arranged to provide images of the surrounding of the vehicle. Said first sensor arrangement can comprise an analysing unit for the camera. Said analysing unit for the camera is arranged to identify objects in the images which are provided by the camera.
Said first sensor arrangement 220a is arranged to provide a first set of identified objects. In this example, the first set of identified objects comprises the objects which have been sensed by the camera and identified by the analysing unit for the camera.

Here, and in the following, it should be understood that a set of identified objects provided by a sensor arrangement can comprise any number of identified objects. In one example, no object is identified. In one example, one object is identified. In one example, more than one object is identified.
Said first sensor arrangement 220a is arranged to associate data to said identified objects in said first set of identified objects. Said data can comprise the position of said identified objects. The position of an identified object is in one example provided in one dimension, for example as a distance to the sensor. The position of an identified object is in one example provided in two dimensions, for example as a distance to the sensor and an angle in relation to the sensor. The position of an identified object is in one example provided in three dimensions, for example as a distance to the sensor and two angles in relation to the sensor. The position can also be provided in a one-, two-, or three-dimensional coordinate system, such as a Cartesian coordinate system.
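As an illustration only, the two-dimensional distance-and-angle representation mentioned above maps to a Cartesian coordinate system as follows (the function name is assumed for this example):

```python
import math

def range_bearing_to_xy(distance_m, angle_rad):
    """Map a position given as distance and angle relative to the
    sensor onto Cartesian x, y coordinates centred on the sensor."""
    return distance_m * math.cos(angle_rad), distance_m * math.sin(angle_rad)

# Example: an object 10 m away at 30 degrees relative to the sensor axis.
x, y = range_bearing_to_xy(10.0, math.radians(30.0))
```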
Said associated data can comprise a velocity of the object. Said velocity can be one-, two-, or three-dimensional.
Said associated data comprises value(s) of at least one non-kinematic attribute, NKA. Examples of non-kinematic attributes are any one of an ID-tag, a kind of object, a colour of the object, a size of the object, or the like. As an example, the first sensor arrangement 220a can be arranged to associate a number as a value for an ID-tag as an NKA to each identified object. In one example, the first sensor arrangement 220a is arranged to associate a colour value as an NKA to each identified object. In one example, the first sensor arrangement 220a is arranged to associate a value for the kind of object as an NKA to each identified object. Examples of values for kind of object are bus, passenger car, truck, bicycle, human being, infrastructure, or the like.
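One way such an identified object and its associated data could be represented is sketched below. This is a hypothetical structure for illustration, not the patented format; all field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class IdentifiedObject:
    """One entry in a set of identified objects: kinematic data plus
    the non-kinematic attributes (NKA) discussed in the text."""
    position: tuple                      # e.g. (x, y) in metres
    velocity: tuple                      # e.g. (vx, vy) in m/s
    nka: dict = field(default_factory=dict)

# Example: a passenger car carrying an ID-tag value, as in the examples above.
obj = IdentifiedObject(position=(12.0, 3.5),
                       velocity=(1.2, 0.0),
                       nka={"id_tag": "A1", "kind": "passenger car"})
```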
Said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object during consecutive providing of a first set of identified objects and associating of data to said identified objects in said first set of identified objects by said first sensor arrangement 220a. As an example, the type of object and/or the colour of the object are intended to remain the same. This is due to the fact that objects typically do not change colour or type. However, this might of course happen accidentally, for example when a camera first is exposed to direct sunlight and then enters a shadowed area, so that the appearance of the colour of the object can change in the camera. Also the ID-tag of the object or the size of the object are examples of values which are intended to remain the same.
Said first sensor arrangement 220a is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ....

Said second sensor arrangement 220b is arranged to perform basically the same tasks as the first sensor arrangement 220a. As an example, said second sensor arrangement 220b is arranged to sense and identify objects.
Said second sensor arrangement 220b can comprise a radar arrangement. The radar arrangement is in one example arranged to detect reflected radar beams from objects. The radar arrangement is arranged to provide radar images of the surrounding of the vehicle. Said second sensor arrangement 220b can comprise an analysing unit for the radar arrangement. Said analysing unit for the radar arrangement is arranged to identify objects in the radar images which are provided by the radar arrangement.
Said second sensor arrangement 220b is arranged to provide a second set of identified objects. In this example, the second set of identified objects comprises the objects which have been sensed by the radar arrangement and identified by the analysing unit for the radar arrangement.
Said second sensor arrangement 220b is arranged to associate data to said identified objects in said second set of identified objects. The second sensor arrangement 220b is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.

It should be understood that the first set of identified objects and the second set of identified objects can either contain the same objects, different objects, or partly the same and partly different objects. That the objects, at least partly, can be different objects is in one example due to the fact that the first and the second sensor arrangements 220a and 220b are arranged to sense different parts of the surrounding of the vehicle. Thus, an object visible in the field of view or the sensing range of the first sensor arrangement might not necessarily be visible in the field of view or the sensing range of the second sensor arrangement. That the objects, at least partly, can be different objects is in one example due to the fact that an object which can be sensed by a camera arrangement is not necessarily detectable by the radar arrangement, and vice versa.
Said second sensor arrangement 220b is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ....

Said third sensor arrangement 220c is arranged to perform basically the same tasks as the first sensor arrangement 220a and the second sensor arrangement 220b. As an example, said third sensor arrangement 220c is arranged to sense and identify objects.
Said third sensor arrangement 220c can comprise a lidar arrangement. The lidar arrangement is in one example arranged to detect reflected laser beams from objects. The lidar arrangement is arranged to provide images of the surrounding of the vehicle based on detection of the reflected laser beams. Said third sensor arrangement 220c can comprise an analysing unit for the lidar arrangement. Said analysing unit for the lidar arrangement is arranged to identify objects in the images which are provided by the lidar arrangement.
Said third sensor arrangement 220c is arranged to provide a third set of identified objects. In this example, the third set of identified objects comprises the objects which have been sensed by the lidar arrangement and identified by the analysing unit for the lidar arrangement.
Said third sensor arrangement 220c is arranged to associate data to said identified objects in said third set of identified objects. The third sensor arrangement 220c is arranged to perform this in a way corresponding to what has been described in relation to the first sensor arrangement 220a and the first set of identified objects.
What has been said regarding the same or different kinds of objects between the first and second sets of identified objects applies in a corresponding way also to the third set of identified objects in relation to the first and/or second set of identified objects.
Said third sensor arrangement 220c is arranged to operate independently of the other sensor arrangements in the plurality of sensor arrangements 220a, 220b, 220c, ....

The sensor fusion system 299 comprises a fusion centre arrangement. Said fusion centre arrangement can be a first control unit 200.
The first sensor arrangement 220a is arranged to transmit said first set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said first sensor arrangement 220a. Said first control unit 200 is arranged for communication with said first sensor arrangement 220a via a link L220a. Said first control unit 200 is arranged to receive information from said first sensor arrangement 220a.
The second sensor arrangement 220b is arranged to transmit said second set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said second sensor arrangement 220b. Said first control unit 200 is arranged for communication with said second sensor arrangement 220b via a link L220b. Said first control unit 200 is arranged to receive information from said second sensor arrangement 220b.
The third sensor arrangement 220c is arranged to transmit said third set of identified objects and said associated data to the first control unit 200. Said first control unit 200 can be arranged to control operation of said third sensor arrangement 220c. Said first control unit 200 is arranged for communication with said third sensor arrangement 220c via a link L220c. Said first control unit 200 is arranged to receive information from said third sensor arrangement 220c.

In one example, the transmission is performed over a CAN-bus of the vehicle.
The first control unit 200 is arranged to combine the identified objects from the first, second, and third sets of identified objects based on the associated data. The first control unit 200 is arranged to perform the combining so that same objects between the different sets of identified objects are identified. The first control unit 200 is arranged to perform the combining so that a common set of objects is provided.
The plurality of sensor arrangements 220a, 220b, 220c, ... and the first control unit 200 are preferably arranged to repeatedly perform the actions that they are arranged for. As an example, the sensing and identifying, the transmitting, and the combining can be performed approximately every 17 milliseconds. Any other time period for the repetition is possible as well.

The first control unit 200 is arranged to, when combining said identified objects from the first, second, and third sets of identified objects, first combine identified objects out of the first, second, and third sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said first, second and third sets. This is explained in more detail in relation to Fig. 3b.
So far, only three sensor arrangements have been described. In general, however, more than the described three sensor arrangements can be part of the plurality of sensor arrangements 220a, 220b, 220c, .... Any of the described first, second, or third sensor arrangements could be provided several times. As an example, the sensor fusion system 299 could be provided with a left side radar and a right side radar. Both the left and the right side radar could be arranged to operate as the described second sensor arrangement. Even several cameras, several lidars, or the like can be provided. Even other types of sensor arrangements can be provided as part of the plurality of sensor arrangements. Examples of such sensor arrangements are sensors arranged to receive information from other vehicles or from infrastructure. In one example, a sensor for receiving vehicle-to-vehicle communication is provided. Other vehicles can send information regarding their position, for example obtained via a GPS-receiver, and/or their type of vehicle to the sensor. Another example is a sensor for receiving infrastructure-to-vehicle communication. As an example, the sensor could be arranged to receive information regarding road signs, such as speed limits, the status of a traffic light, or the like.
The first control unit 200 is in one example arranged to transmit information regarding the common set of objects to a device array 270. Said first control unit 200 can be arranged to control operation of said device array 270. Said first control unit 200 is arranged for communication with said device array 270 via a link L270. Said first control unit 200 can be arranged to receive information from said device array 270.
The device array 270 is arranged to receive the transmitted information from the first control unit 200. The device array 270 can comprise one or more devices 271-276. In one example, the device array 270 comprises an adaptive cruise control device 271. In one example, the device array 270 comprises a warning system for pedestrians and/or bicyclists 272. In one example, the device array 270 comprises a blind spot warning device 273. In one example, the device array 270 comprises an automatic or assisted queue driving system 274. In one example, the device array 270 comprises an automatic or assisted overtaking system 275. In one example, the device array 270 comprises an automatic or assisted reversing system 276. Said received information can then be provided as input information to any of the device(s) in the device array 270. In principle, any device or system which needs information regarding surrounding objects of the vehicle can be part of the device array 270. In one example, a screen is part of the device array 270.
A second control unit 205 is arranged for communication with the first control unit 200 via a link L205 and may be detachably connected to it. It may be a control unit external to the vehicle 100. It may be adapted to conducting the innovative method steps according to the invention. It may be used to cross-load software to the first control unit 200, particularly software for conducting the innovative method. It may alternatively be arranged for communication with the first control unit 200 via an internal network on board the vehicle. It may be adapted to performing substantially the same functions as the first control unit 200. The innovative method may be conducted by the first control unit 200 or the second control unit 205, or by both of them.
The sensor fusion system 299 can perform any of the method steps described later in relation to Fig. 4.
Fig. 3a shows, in a schematic way, an example of sensor fusion according to prior art. When referring to matching of objects in the description of Fig. 3a and 3b, this relates to the fusion of the sensor data relating to this object. Two sets of identified objects are provided. A first set of identified objects comprises identified objects 300-305. The first set is indicated by circles filled with black colour. A second set of identified objects comprises identified objects 310-315. The second set is indicated by circles filled with white colour. An objective of sensor fusion is to combine the two sets of identified objects so that a common set of identified objects is obtained. Especially, the same physical objects among the identified objects in the first and the second set have to be found.

According to a prior art solution, an attempt is made to match a first identified object 300 of the first set to all identified objects 310-315 of the second set. This is indicated by continuous lines. An attempt is then made to match a second identified object 301 of the first set to all identified objects 310-315 of the second set. This is indicated by dashed lines. This procedure continues until all objects in the first set and the second set have been checked against each other as to whether they refer to the same physical object. In the shown example, an attempt is made to match a last identified object 305 of the first set to all identified objects 310-315 of the second set. This is indicated by dotted lines. Attempts are also made to match the other identified objects of the first set to the identified objects of the second set. However, the lines indicating these matchings have been omitted so as not to overload the figure. The matching could, for example, include calculating the probability that an object from the first set refers to the same physical object as an object from the second set. The two objects with the highest probability of matching are then matched together. Some thresholds or the like can be used to avoid matching objects which in reality do not correspond to the same physical object and which have not been identified in both sets.
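For contrast, the exhaustive prior-art association just described can be caricatured as an all-pairs scoring loop. This is a hedged sketch: `match_probability` stands in for whatever association metric a concrete system uses, the 0.5 threshold is arbitrary, and the greedy selection is one simple choice among many.

```python
def associate_all_pairs(first_set, second_set, match_probability, threshold=0.5):
    """Prior-art style association as in Fig. 3a: score every object of
    the first set against every object of the second set, then greedily
    keep the best-scoring pairings above a threshold. The cost grows
    with len(first_set) * len(second_set)."""
    scored = sorted(((match_probability(a, b), i, j)
                     for i, a in enumerate(first_set)
                     for j, b in enumerate(second_set)),
                    reverse=True)
    matched, used_i, used_j = [], set(), set()
    for prob, i, j in scored:
        if prob >= threshold and i not in used_i and j not in used_j:
            matched.append((first_set[i], second_set[j]))
            used_i.add(i)
            used_j.add(j)
    return matched
```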
Fig. 3b shows, in a schematic way, an example of sensor fusion according to the present invention. Two sets of identified objects are provided. A first set of identified objects comprises identified objects 300-305. The first set is indicated by circles filled with black colour. A second set of identified objects comprises identified objects 310-315. The second set is indicated by circles filled with white colour. The identified objects can be the same as in the prior art solution. However, all identified objects will have a respective NKA associated to them, for example an ID-tag. As an example, the identified objects 300-305 of the first set could have A1, A5, A13, A3, A127, and A21 as values for the respective ID-tag. As an example, the identified objects 310-315 of the second set could have B3, B69, B338, B15, B7, and B28 as values for the respective ID-tag. The values for the NKA are provided independently by the two sensor arrangements, i.e. the first sensor arrangement providing the first set of identified objects does in general not "know" what values for the NKA are provided by the second sensor arrangement, and vice versa.
The sensor providing the first set and the sensor providing the second set are so-called smart sensors or so-called tracking sensors. The sensor arrangement providing the first set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed. The sensor arrangement providing the second set is arranged to provide the same value of the NKA to the same sensed and identified physical object every time a sensing and identifying process is performed. As an example, in case a specific pedestrian is identified as the first identified object 300 in the first set and the value A1 is associated to that object, the first sensor arrangement is arranged to associate the value A1 again to the object representing the pedestrian the next time the sensor arrangement senses and identifies objects, i.e. at the next run of a method according to the present disclosure. At the next run, however, the specific pedestrian does not need to be the first identified object but is, for example, the third identified object. In this case, the value A1 of the ID-tag will be associated to the third identified object at the next run.
The same applies to the second sensor arrangement. As an example, the same specific pedestrian as discussed in connection with the first sensor arrangement is identified as the third object and receives, as an example, B338 as the value for the ID-tag.
When trying to match the objects from the two sets, the fusion centre arrangement will first analyse whether an object with a first value of the NKA of the first set has been matched to an object with a second value of the NKA of the second set before, i.e. at a previous run of the method. As an example, the fusion centre arrangement will receive A1 as the value for the ID-tag for the first identified object of the first set. The fusion centre arrangement will then analyse whether A1 was matched to a value for the ID-tag in the second set before. The fusion centre arrangement might find that A1 has been matched to B338 before and will then find that B338 belongs to the third identified object of the second set. The fusion centre will then match these two objects before matching any objects which have no previously used "partner" in the second set.
As a result, instead of calculating probabilities with all identified objects in the second set and then deciding which objects to match, which is a computationally intense task, the only thing which has to be done is to compare values for ID-tags, which is a computationally easy task and can be performed relatively fast.

In Fig. 3b it is thus indicated by the only continuous line that the first object of the first set can be matched directly to the third object of the second set. As an example, the second object of the first set, having the value A5 as ID-tag, might previously have been matched to an object of the second set having B69 as value of the ID-tag. In that case, the fusion centre arrangement identifies that the second object in the second set now has B69 as value for the ID-tag and will then match the second object from the first set with the second object from the second set. This is indicated by the only dashed line. After all the objects of the first set which have been matched with an object of the second set before have been matched again, because an object with the corresponding value of the NKA is still present in the second set, an attempt is made to match the remaining objects. This is indicated for the last object of the first set by the dotted line.
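The comparison of ID-tag values described above amounts to a dictionary lookup. A minimal sketch, reusing the example tag values A1, A5, B69 and B338 from the description (the table layout and placeholder object data are assumptions):

```python
# Pairings remembered from a previous run: first-set tag -> second-set tag.
previous_matches = {"A1": "B338", "A5": "B69"}

# Current run (object data elided): ID-tag -> identified object.
first_set = {"A1": "pedestrian track", "A5": "car track", "A13": "sign track"}
second_set = {"B338": "pedestrian track", "B69": "car track", "B3": "other track"}

fused_fast, remaining = [], dict(first_set)
for tag_a, tag_b in previous_matches.items():
    if tag_a in first_set and tag_b in second_set:
        # Comparing two tag values replaces a full probability calculation.
        fused_fast.append((first_set[tag_a], second_set[tag_b]))
        remaining.pop(tag_a)

# Objects still in `remaining` (here A13) go through ordinary
# gating and association against the unmatched part of the second set.
```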
Fig. 4 shows, in a schematic way, a flow chart over an example of a method 400 for decentralised sensor fusion in a vehicle according to the present invention. Not all steps of the described method are necessary. Instead, the method is presented in a way in which it might be implemented in a vehicle. However, several of the steps are optional and can thus be omitted in other examples of the method. Which steps are optional will in general depend on the specific implementation in a specific vehicle.
The method starts with a step 410 of sensing and identifying objects. Said sensing can be performed by camera arrangements, radar arrangements, lidar arrangements or the like, as described in relation to Fig. 2. Said sensing can also be performed by receiving information which is sent out from other vehicles or infrastructure, as described in relation to Fig. 2. Step 410 is performed independently by a plurality of sensor arrangements. Each sensor arrangement is arranged to identify objects based on the data it receives through the sensing. The sensing and identifying of the objects is performed by each sensor arrangement in such a way that a set of identified objects is provided. Step 410 can be performed repeatedly. The method continues with step 420.

In step 420, data is associated to the identified objects in the set of objects. This is performed independently by each sensor arrangement for the respective set of identified objects. Said associated data can comprise position and velocity of the objects as described before. Said associated data comprises value(s) of at least one non-kinematic attribute, NKA. Said value(s) of the at least one NKA is/are independently provided by each sensor arrangement. Said value(s) of said at least one NKA is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method 400. As an example, in case a first car is identified by a first sensor arrangement, it is intended that the same value of the NKA is always associated to said first car. This can be achieved in a number of different ways. In case the first car is adapted to send a unique ID-number, the sensor arrangement can be arranged to analyse the ID-numbers of other cars and always associate the same value of the NKA to cars sending out the same ID-number. In another example, a sensor arrangement comprises a camera which provides images. The provided images can be analysed by the sensor arrangement and values for the NKA can be associated to each identified object in the image. The sensor arrangement typically is arranged to repeat step 410 in the order of some tens of milliseconds. Typical objects surrounding a vehicle will not move very far during such a time period. Step 420 can be performed repeatedly. The sensor arrangement can then be arranged to associate the same value for the NKA to each identified object which is basically at the same place in the provided image on a next run of the method 400 (see the sketch below). More examples are known in the art of how sensor arrangements can keep track of identified objects. Basically any such sensor arrangement can be used to associate the same value of the NKA to the same identified object in a next run of step 420. The method continues with step 430.

In step 430, the set of identified objects and the associated data is transmitted to a fusion centre arrangement. Said fusion centre arrangement can be an electronic control unit as described in relation to Fig. 2. Said fusion centre arrangement can be an existing electronic control unit in the vehicle. Said electronic control unit can be situated at a distance from the sensor arrangement. In one example, said fusion centre arrangement is implemented at one sensor arrangement out of the plurality of sensor arrangements. The method continues with step 440. All of the following steps are preferably performed by the fusion centre arrangement.
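One simple realisation of the stable tagging in step 420 — the sketch referred to above — is to re-use the tag of the nearest detection from the previous run. All names and the 1 m threshold are assumptions made for this illustration, not taken from the patent.

```python
import itertools
import math

_tag_counter = itertools.count(1)

def assign_stable_tags(detections, previous, max_jump_m=1.0):
    """Give each detection the ID-tag of the nearest detection from the
    previous run if it has moved less than max_jump_m; otherwise mint a
    new tag. detections: list of (x, y); previous: {tag: (x, y)}."""
    tagged, free = {}, dict(previous)
    for pos in detections:
        best_tag, best_dist = None, max_jump_m
        for tag, old_pos in free.items():
            dist = math.dist(pos, old_pos)
            if dist < best_dist:
                best_tag, best_dist = tag, dist
        if best_tag is None:
            best_tag = f"A{next(_tag_counter)}"   # new object, new tag
        else:
            free.pop(best_tag)                    # tag re-used, reserve it
        tagged[best_tag] = pos
    return tagged
```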
In step 440, it is analysed which value(s) of said at least one non-kinematic attribute of the identified objects out of the different sets had been combined previously and had been identified as same objects. This can be done in the following way:

For an object in a first set of identified objects, the value(s) of the NKA is/are determined. These value(s) of the NKA can be denoted first value(s) of the NKA. It is analysed whether that object of the first set has previously been identified as representing the same physical object as an identified object in any of the other sets of identified objects. This can be done by analysing, for example in a look-up table, whether said first value(s) of the NKA had previously been combined with second value(s) of the NKA attributed to identified objects in any other set of identified objects. It can then be analysed whether any of said second value(s) of the NKA are present in any other of the present sets of identified objects. If this is the case, it can be concluded that the object(s) in the other sets of identified objects having said second value(s) of the NKA still relate to the same physical object. For the objects of the different sets of identified objects which have been identified as representing the same physical object, the method continues with step 470. In one example, step 490 is performed before continuing with step 470 for the objects of the different sets of identified objects which have been identified as representing the same physical object. The above can be done for all identified objects of all sets of identified objects.

In one example, it is analysed whether objects of the different sets of identified objects have been identified as representing the same physical object for a pre-determined number of times. As an example, said pre-determined number of times can be two, three, four, five, or more times. Only if these objects have been analysed as representing the same physical object for at least said pre-determined number of times does the method continue with step 470 for these objects. This assures that an accidental identification of objects as representing the same physical object during one or a few runs of the method will not lead to these objects accidentally being analysed as the same objects when continuously performing the method 400. In case identified objects have been identified as representing the same physical object for a pre-determined number of times, it can be concluded that the identification as the same physical object is likely not accidental.
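The look-up table and the pre-determined confirmation count described above could be combined in one small structure. A minimal sketch; the threshold of three runs and all names are assumptions for this example:

```python
from collections import defaultdict

MIN_CONFIRMATIONS = 3              # the pre-determined number of runs (assumed)
pair_counts = defaultdict(int)     # (tag_a, tag_b) -> confirmations so far

def record_match(tag_a, tag_b):
    """Called once per run for every pairing that gating/association
    (or the fast path) confirms as the same physical object."""
    pair_counts[(tag_a, tag_b)] += 1

def fast_path_eligible(tag_a, tag_b):
    """Step 440 only takes the cheap re-match once the pairing has been
    confirmed for the pre-determined number of previous runs."""
    return pair_counts[(tag_a, tag_b)] >= MIN_CONFIRMATIONS
```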
The method will continue with step 450 for the objects which do not continue with step 490 or step 470.

In step 450, the data relating to the remaining objects is gated. This is performed in accordance with any gating process in combination with sensor fusion which is known in the art. The method continues with step 460.

In step 460, an association algorithm is performed. This algorithm is intended to find identified objects representing the same physical object among the remaining objects in the sets of identified objects. Such association algorithms are well known in the art. The method continues with step 470. In one example, the order of the steps 460 and 470 is reversed.

In step 470, data relating to the same physical object is fused. The process of data fusion for data relating to the same physical object is known in the art. That data relates to the same physical object is known either from step 460 or from step 440, or from step 490, respectively.
The method continues with step 480.

In the optional step 490, it is analysed whether it is plausible that identified objects from at least two sets of identified objects really represent the same physical object. In other words, it is analysed whether the identification as the same physical object based on the value(s) of the NKA is reasonable. Such a plausibility test can be performed in a number of ways. In one example, it is analysed whether the identified objects have approximately the same velocity. In one example, it is analysed whether the identified objects have approximately the same position. In one example, it is analysed whether the identified objects are of the same type, have approximately the same colour, approximately the same size, and/or have approximately the same values for any other parameter. A plausibility test can also comprise a combination of any of the above examples. When referring to the term approximately, it should be understood that every sensor arrangement has a certain degree of uncertainty in determining data or parameters relating to the identified objects. As an example, the determined velocity will have a certain degree of uncertainty. The degree of uncertainty can differ between different sensor arrangements. In one example, it is determined that it is plausible that identified objects represent the same physical object if the values of a parameter, including its uncertainty, from different sensor arrangements at least partly overlap. As an example, a first sensor arrangement could have determined the velocity of an object as 16 km/h ± 4 km/h and a second sensor arrangement could have determined the velocity of an object as 22 km/h ± 3 km/h. It can then be determined that the values for the velocities partly overlap when including the uncertainty, and that it therefore is plausible that the identified objects represent the same physical object. More complex algorithms known in the art can be used as well.

In case it is determined that it is plausible that identified objects from at least two sets of identified objects really represent the same physical object, the method continues with step 470 for these objects. In case it is determined that it is not plausible that identified objects from at least two sets of identified objects really represent the same physical object, the method continues with step 450 for these objects.

In step 480, a track management procedure is performed. This track management procedure can include storing information regarding which of the identified objects from the different sets of identified objects have been determined to represent the same physical object. This track management procedure can include storing the values of the NKA for the identified objects which have been determined to represent the same physical object. In one example, the method 400 ends after step 480. In a preferred example, the method 400 starts again. In a preferred example, the method 400 is performed repeatedly during operation of the vehicle.
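The interval-overlap test in the velocity example of step 490 is straightforward to state in code; the function name is assumed for this illustration:

```python
def plausible_same_object(value_a, unc_a, value_b, unc_b):
    """True if the two measured values, widened by their uncertainties,
    overlap at least partly: [a-ua, a+ua] against [b-ub, b+ub]."""
    return max(value_a - unc_a, value_b - unc_b) <= min(value_a + unc_a, value_b + unc_b)

# The velocity example from the text: 16 km/h +/- 4 gives [12, 20] and
# 22 km/h +/- 3 gives [19, 25]; the intervals overlap on [19, 20], so the
# identification as the same physical object is considered plausible.
assert plausible_same_object(16.0, 4.0, 22.0, 3.0)
```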
Figure 5 is a diagram of one version of a device 500. The control units 200 and 205 described with reference to Figure 2 may in one version comprise the device 500. The device 500 comprises a non-volatile memory 520, a data processing unit 510 and a read/write memory 550. The non-volatile memory 520 has a first memory element 530 in which a computer program, e.g. an operating system, is stored for controlling the function of the device 500. The device 500 further comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date input and transfer unit, an event counter and an interruption controller (not depicted). The non-volatile memory 520 also has a second memory element 540.
The computer program P may comprise routines for decentralised sensor fusion in a vehicle. This may at least partly be performed by means of said first control unit 200 and said plurality of sensor arrangements 220a, 220b, 220c, .... The computer program P may comprise routines for sensing and identifying objects with a plurality of sensor arrangements. The computer program P may comprise routines for associating data to said identified objects in the sets of identified objects. Said associated data comprises value(s) of at least one non-kinematic attribute. This might at least partly be performed by said plurality of sensor arrangements 220a, 220b, 220c, .... Said associated data, especially said value(s) for the NKA, can be stored in said non-volatile memory 520.
The computer program P may comprise routines for combining the identified objects from all said respective sets of identified objects based on said associated data, so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided. This may at least partly be performed by means of said first control unit 200.
The computer program P may comprise routines for first combining identified objects out of different sets of identified objects whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects, before combining other identified objects from said respective sets. This may at least partly be performed by means of said first control unit 200, for example based on accessing stored values for the NKA from said non-volatile memory 520.
The computer program P may comprise routines for determining whether objects have been identified as the same object for a pre-determined number of previous runs of the method. This may at least partly be performed by means of said first control unit 200.
The computer program P may comprise routines for determining whether an identification of objects as the same physical object is plausible. This may at least partly be performed by means of said first control unit 200.
The program P may be stored in an executable form or in compressed form in a memory 560 and/or in a read/write memory 550.
Where it is stated that the data processing unit 510 performs a certain function, it means that it conducts a certain part of the program which is stored in the memory 560, or a certain part of the program which is stored in the read/write memory 550.
The data processing device 510 can communicate with a data port 599 via a data bus 515. The non-volatile memory 520 is intended for communication with the data processing unit 510 via a data bus 512. The separate memory 560 is intended to communicate with the data processing unit via a data bus 511. The read/write memory 550 is arranged to communicate with the data processing unit 510 via a data bus 514. The links L205, L220, L240, L250, and L270, for example, may be connected to the data port 599 (see Figure 2).
When data are received on the data port 599, they can be stored temporarily in the second memory element 540. When the input data received have been temporarily stored, the data processing unit 510 can be prepared to conduct code execution as described above.
Parts of the methods herein described may be conducted by the device 500 by means of the data processing unit 510, which runs the program stored in the memory 560 or the read/write memory 550. When the device 500 runs the program, the methods herein described are executed.
The foregoing description of the preferred embodiments of the present invention is provided for illustrative and descriptive purposes. It is neither intended to be exhaustive, nor to limit the invention to the variants described. Many modifications and variations will obviously suggest themselves to one skilled in the art. The embodiments have been chosen and described in order to best explain the principles of the invention and their practical applications, and thereby make it possible for one skilled in the art to understand the invention for different embodiments and with the various modifications appropriate to the intended use.

It should especially be noted that the system according to the present disclosure can be arranged to perform any of the steps or actions described in relation to the method 400. It should also be understood that the method according to the present disclosure can further comprise any of the actions attributed to an element of the sensor fusion system 299 described in relation to Fig. 2. The same applies to the computer program and the computer program product.

Claims (12)

1. A method (400) for decentralised sensor fusion in a vehicle, the method comprising the steps of:
- sensing and identifying (410) objects with a plurality of sensor arrangements, wherein said sensing and identifying is performed independently by each sensor arrangement out of said plurality of sensor arrangements so that each sensor arrangement provides a respective set of identified objects;
- for each sensor arrangement out of said plurality of sensor arrangements: associating (420), by said sensor arrangement, data to said identified objects in said respective set of identified objects;
- combining (440-470) the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided;
characterised in that said associated data comprises value(s) of at least one non-kinematic attribute which is provided by the corresponding sensor arrangement, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive runs of the method, and characterised in that the step of combining said identified objects from all said respective sets of identified objects comprises first combining (440; 470) identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining (450-470) other identified objects from said respective sets.

2. The method according to the previous claim, wherein said non-kinematic attribute is an ID-tag.

3. The method according to any one of the previous claims, wherein said method is repeatedly performed during operation of the vehicle.

4. The method according to any one of the previous claims, wherein said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects which have been identified as the same object for a pre-determined number of previous runs of the method.

5. The method according to any one of the previous claims, wherein said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects is only performed for objects for which a plausibility check (490) indicates that identifying these objects as the same object again is plausible.
6. A sensor fusion system (299), said system comprising:
- a plurality of sensor arrangements (220a, 220b, 220c, ...), wherein each sensor arrangement out of the plurality of sensor arrangements (220a, 220b, 220c, ...) is arranged to independently sense and identify objects, to provide a respective set of identified objects, to associate data to said identified objects in said respective set of identified objects, and to transmit said respective set of identified objects and said associated data to a fusion centre arrangement (200, 205);
- the fusion centre arrangement (200, 205), being arranged to combine the identified objects from all said respective sets of identified objects based on said associated data so that same objects between the different respective sets of identified objects are identified and so that a common set of objects is provided;
characterised in that said associated data comprises value(s) of at least one non-kinematic attribute, wherein said value(s) of said at least one non-kinematic attribute is/are intended to remain the same for the same object identified by the same sensor arrangement during consecutive providing of a respective set of identified objects and associating data to said identified objects in said respective set of identified objects by each sensor arrangement, and characterised in that the fusion centre arrangement (200, 205) is arranged to, when combining said identified objects from all said respective sets of identified objects, first combine identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects before combining other identified objects from said respective sets.

7. The system according to the previous claim, wherein said non-kinematic attribute is an ID-tag.

8. The system according to any one of claims 6-7, wherein said fusion centre arrangement (200; 205) is arranged to first combine the identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects which have been identified as the same object for a pre-determined number of times.

9. The system according to any one of claims 6-8, wherein the fusion centre arrangement (200; 205) further is arranged to perform plausibility checks and to perform said first combining of identified objects out of different respective sets whose value(s) of said at least one non-kinematic attribute had been combined previously and had been identified as same objects only for objects for which the plausibility check indicates that identifying these objects as the same object again is plausible.

10. A vehicle (100) comprising a sensor fusion system (299) according to any one of claims 6-9.

11. A computer program (P) for decentralised sensor fusion in a vehicle, wherein said computer program (P) comprises program code for causing an electronic control unit (200; 500) or a computer (205; 500) connected to the electronic control unit (200; 500) to perform the steps according to any of the claims 1-5.

12. A computer program product containing a program code stored on a computer-readable medium for performing method steps according to any of claims 1-5, when said computer program is run on an electronic control unit (200; 500) or a computer (205; 500) connected to the electronic control unit (200; 500).
SE1650719A 2016-05-25 2016-05-25 Method for decentralised sensor fusion in a vehicle and sensor fusion system SE1650719A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1650719A SE1650719A1 (en) 2016-05-25 2016-05-25 Method for decentralised sensor fusion in a vehicle and sensor fusion system
PCT/SE2017/050495 WO2017204719A1 (en) 2016-05-25 2017-05-15 Method for decentralised sensor fusion in a vehicle and sensor fusion system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1650719A SE1650719A1 (en) 2016-05-25 2016-05-25 Method for decentralised sensor fusion in a vehicle and sensor fusion system

Publications (1)

Publication Number Publication Date
SE1650719A1 true SE1650719A1 (en) 2017-11-26

Family

ID=60411762

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1650719A SE1650719A1 (en) 2016-05-25 2016-05-25 Method for decentralised sensor fusion in a vehicle and sensor fusion system

Country Status (2)

Country Link
SE (1) SE1650719A1 (en)
WO (1) WO2017204719A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113167892A (en) * 2019-02-06 2021-07-23 宝马股份公司 Method and device for sensor data fusion of a vehicle

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110378178B (en) * 2018-09-30 2022-01-28 毫末智行科技有限公司 Target tracking method and device
DE102019205504A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Control device and method as well as computer program product

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10133945A1 (en) * 2001-07-17 2003-02-06 Bosch Gmbh Robert Method and device for exchanging and processing data
US7283938B1 (en) * 2004-04-15 2007-10-16 Lockheed Martin Corporation Virtual sensor for data and sensor fusion
US8010658B2 (en) * 2007-02-09 2011-08-30 Raytheon Company Information processing system for classifying and/or tracking an object
US8805648B2 (en) * 2011-10-05 2014-08-12 Ats Group (Ip Holdings) Limited Data fusion in high computational load environments

Also Published As

Publication number Publication date
WO2017204719A1 (en) 2017-11-30

Similar Documents

Publication Publication Date Title
KR101942109B1 (en) Method and system for validating information
US10839694B2 (en) Blind spot alert
US10410513B2 (en) Fusion of non-vehicle-to-vehicle communication equipped vehicles with unknown vulnerable road user
US9165198B2 (en) Method for identifying a vehicle during vehicle-to-vehicle communication
US10227813B2 (en) Device and method for opening trunk of vehicle, and recording medium for recording program for executing method
JP2020107324A5 (en)
US20210031783A1 (en) Information processing system, information processing device, information processing method, and non-transitory computer readable storage medium storing program
SE1650719A1 (en) Method for decentralised sensor fusion in a vehicle and sensor fusion system
CN113287073A (en) Automatic driving automobile simulator using network platform
EP3372465B1 (en) Method and system for vehicle status based advanced driver assistance
US11897511B2 (en) Multi-hypothesis object tracking for automated driving systems
US11847562B2 (en) Obstacle recognition assistance device, obstacle recognition assistance method, and storage medium
CN110244306A (en) Passive sound location classification and positioning
CN109720352A (en) Vehicle drive auxiliary control method and equipment
CN112649809A (en) System and method for fusing sensor data in a vehicle
CN109703555A (en) Method and apparatus for detecting object shielded in road traffic
US20240103132A1 (en) Radar apparatus and method for classifying object
CN114026624B (en) Recognition of objects by far infrared camera
US20170092121A1 (en) Method and System for Determining and Using Property Associations
US11169253B2 (en) Discriminate among and estimate velocities of multiple objects using multi-node radar system
US20210081669A1 (en) Event-based identification and tracking of objects
US20230147100A1 (en) Clustering Track Pairs for Multi-Sensor Track Association
KR102391173B1 (en) Radar sensor R&D method with artificial intelligence machine learning
US20180093605A1 (en) Methods and systems for unidirectional and bidirectional communications
US20200211210A1 (en) Intersection of point cloud and image to determine range to colored light sources in vehicle applications

Legal Events

Date Code Title Description
NAV Patent application has lapsed