CN109583505A - Multi-sensor object association method, apparatus, device and medium - Google Patents

Multi-sensor object association method, apparatus, device and medium

Info

Publication number
CN109583505A
CN109583505A (application number CN201811481180.4A)
Authority
CN
China
Prior art keywords
track data
data collection
sensor
fusion
matched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811481180.4A
Other languages
Chinese (zh)
Inventor
徐铎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201811481180.4A
Publication of CN109583505A


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a multi-sensor object association method, apparatus, device and medium. The method comprises: obtaining at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set; determining the similarity between the to-be-matched sensor track data sets and the fusion track data sets; and determining, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets. Through the above technical solution, the association accuracy and association efficiency for the same object observed by different sensors are improved.

Description

Multi-sensor object association method, apparatus, device and medium
Technical field
The embodiments of the present invention relate to sensor data fusion technology, and in particular to a multi-sensor object association method, apparatus, device and medium.
Background art
Sensor data fusion is a core topic in current robotics and related technologies. In a real environment, a single sensor cannot accurately obtain all of the measurement information of the objects to be measured. Different sensors have complementary strengths and weaknesses. Some sensors, such as cameras, can accurately identify the type of an object and have a sufficiently long detection range, but their measurement of information such as object position and velocity is relatively poor. A lidar sensor has extremely high measurement accuracy for object position, but owing to the characteristics of the sensor itself it currently provides at most 128 scan lines, and each line can only measure one plane in 3D space, so the point cloud is sparse; considering the accuracy with which objects can be recognized, the effective measurement range is therefore significantly smaller than the sensor's own measurement range. A millimeter-wave radar measures object velocity accurately based on the Doppler effect, but can only obtain 2D position information of an object. It is therefore necessary to combine the data obtained by different sensors for the same object in order to obtain accurate information on each attribute of the object, and sensor fusion algorithms are aimed at solving exactly this kind of problem. Late fusion means fusing the raw data obtained by each sensor after it has been preliminarily processed and abstracted into individual objects, each sensor providing the information of each object in the same space; for example, in the perception system of a driverless vehicle, the vehicle and pedestrian information of the road ahead can be obtained in this way. The post-fusion association algorithm is the algorithm that establishes a one-to-one correspondence between the pieces of object information obtained by different sensors for the same object. The post-fusion association algorithm is the first step of a perception system based on late fusion and the cornerstone of the whole late fusion process: if an error occurs in the association, errors will also occur in the subsequent velocity fusion, position fusion, and so on. It is therefore highly necessary to improve the accuracy of the post-fusion association algorithm.
The current mainstream post-fusion association algorithm constructs certain features and, based on these features, gives a metric of the similarity between the objects obtained by different sensors under certain conditions. The similarities between the different objects of the different sensors are then accumulated according to the principle of global similarity maximization, and the result with the maximum overall similarity is the fusion association result.
The features constructed by post-fusion association algorithms generally include the following:
The position of the measured object. The positions of the same object obtained by different sensors should theoretically be identical. From the distribution of the measurement standard deviation of each sensor, the probability that two objects are the same object under the current measurement conditions can be obtained.
The velocity of the measured object. The velocity of the measured object includes both direction and magnitude, and the likelihood that two objects are the same object is computed from the similarity of the velocity vectors.
The projection of the lidar point cloud of the measured object onto the camera image. The similarity is determined by calculating the percentage of projected points falling inside a given detection box relative to the total number of points in the point cloud.
The projection of the position obtained by the millimeter-wave radar onto the camera image. The 3D box position obtained by the millimeter-wave radar is projected onto the camera plane, and its intersection-over-union (IoU) with the image detection box ROI is calculated; a rough sketch of this computation follows below.
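As a rough illustration of the last feature, the sketch below computes the intersection-over-union of a radar box projected into the image plane and a camera detection box. The function name, box representation and numeric values are assumptions for illustration only, not part of the patent.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# e.g. projected radar box vs. camera detection box (made-up coordinates)
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # -> 0.1428...
```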
In the prior art, object similarity is computed based on individual sensor data frames, and the objects perceived by different sensors are matched according to this object similarity to obtain the post-fusion association result. However, the information obtained from a single data frame almost always contains more or less noise, and the positional relationships that may exist between sensors change with time in an uncertain way, which introduces systematic deviation or random noise. Existing post-fusion association algorithms all set a threshold: once the similarity falls below this threshold, the pair is no longer added to the matching algorithm. Because of the presence of noise, however, a similarity near the threshold floats up and down with the variation of the noise, so the corresponding match does not appear stably, which ultimately degrades the performance of the overall post-fusion algorithm. Moreover, when different features are used, the features do not lie in the same metric space, so the weights of the different features are very difficult to determine.
It can be seen that the prior-art scheme of associating the objects perceived by different sensors using a post-fusion association algorithm has relatively low accuracy.
Summary of the invention
The present invention provides a multi-sensor object association method, apparatus, device and medium, so as to improve the association accuracy and association efficiency for the same object observed by different sensors.
In a first aspect, an embodiment of the present invention provides a multi-sensor object association method, the method comprising:
obtaining at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set;
determining the similarity between the to-be-matched sensor track data sets and the fusion track data sets; and
determining, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets;
wherein each sensor track data set contains the feature data of the same object, determined respectively from a set number of sensor data frames continuously output by the same sensor; different sensor track data sets correspond to different objects; the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors; and different fusion track data sets correspond to different objects.
In a second aspect, an embodiment of the present invention further provides a multi-sensor object association apparatus, the apparatus comprising:
an obtaining module, configured to obtain at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set;
a similarity determination module, configured to determine the similarity between the to-be-matched sensor track data sets and the fusion track data sets; and
an association relationship determination module, configured to determine, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets;
wherein each sensor track data set contains the feature data of the same object, determined respectively from a set number of sensor data frames continuously output by the same sensor; different sensor track data sets correspond to different objects; the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors; and different fusion track data sets correspond to different objects.
In a third aspect, an embodiment of the present invention further provides an electronic device, the electronic device comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the multi-sensor object association method according to any one of claims 1-11.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the multi-sensor object association method according to any one of claims 1-11.
The present invention obtains at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set; determines the similarity between the to-be-matched sensor track data sets and the fusion track data sets; and determines, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets, wherein each sensor track data set contains the feature data of the same object determined respectively from a set number of sensor data frames continuously output by the same sensor, different sensor track data sets correspond to different objects, the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors, and different fusion track data sets correspond to different objects. Through these technical means, the association accuracy and association efficiency for the same object observed by different sensors are improved.
Brief description of the drawings
Fig. 1 is a flow diagram of a multi-sensor object association method in Embodiment one of the present invention;
Fig. 2 is a flow diagram of a multi-sensor object association method in Embodiment two of the present invention;
Fig. 3 is a flow diagram of a multi-sensor object association method in Embodiment three of the present invention;
Fig. 4 is a structural schematic diagram of a multi-sensor object association apparatus in Embodiment four of the present invention;
Fig. 5 is a structural schematic diagram of an electronic device in Embodiment five of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and not to limit it. It should also be noted that, for convenience of description, only the parts related to the present invention are shown in the drawings, rather than the entire structure.
Embodiment one
Fig. 1 is a flow diagram of a multi-sensor object association method provided by Embodiment one of the present invention. This embodiment is applicable to the case where data obtained by multiple sensors simultaneously detecting multiple objects are fused, so that data for the same object coming from multiple sensors can be associated. The method can be executed by a multi-sensor object association apparatus, which can be implemented in software and/or hardware; the apparatus is generally integrated in a terminal, which may specifically be, for example, a server. Referring to the flow diagram of the multi-sensor object association method shown in Fig. 1, the method specifically comprises the following steps.
Step 110: obtain at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set.
Here, each sensor track data set contains the feature data of the same object, determined respectively from a set number of sensor data frames continuously output by the same sensor; different sensor track data sets correspond to different objects; the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors; and different fusion track data sets correspond to different objects. The feature data of an object may be, for example, velocity data and/or position data.
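The following sketch illustrates, under assumed names, one possible way to represent the two kinds of track data sets described above. It is only an illustration of the data layout implied by the text, not an implementation taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorTrackSet:
    """Feature data of one object, built from N consecutive frames of one sensor."""
    sensor_id: str                      # which sensor produced the frames
    object_id: str                      # which object the features describe
    frames: List[dict] = field(default_factory=list)  # per-frame feature data
    # e.g. frames = [{"t": 5.0, "velocity": 45.0, "position": (1.2, 3.4)}, ...]

@dataclass
class FusionTrackSet:
    """Group of sensor track sets believed to describe the same object."""
    object_id: str
    sensor_tracks: List[SensorTrackSet] = field(default_factory=list)
    # invariant: all sensor_tracks share object_id but have distinct sensor_id
```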
The meaning of the to-be-matched sensor track data sets and the at least one saved fusion track data set involved in step 110 is illustrated by the following example.
Assume that two different sensors are configured on a driverless vehicle (in practical applications a driverless vehicle is usually configured with more than two different sensors; this embodiment uses two different sensors for illustration, and when more than two different sensors are configured, the process of fusing the data from the different sensors is the same as the process of fusing data from two different sensors described here). The two sensors are a camera and a radar, and they detect the driving scene of the driverless vehicle while it is travelling. The radar and the camera each output one sensor data frame per second, and whenever the same sensor (the radar or the camera) has continuously output 5 sensor data frames, an object feature determination operation is performed; specifically, an existing feature determination algorithm can be used to determine the feature data of the same object based on the 5 sensor data frames continuously output by that sensor. Assume that the radar starts working 1 s earlier than the camera, and take the time the radar starts working as the time origin; the camera then starts working at 2 s. At 5 s, the sensor data frames continuously output by the radar reach 5 frames, so object feature identification is performed on the 5 sensor data frames continuously output by the radar. Assuming that the sensor data frames output by the radar contain detection data of object A and object B, performing object feature identification on the 5 continuously output radar frames yields the sensor track data set of object A, denoted ST_L-A, and the sensor track data set of object B, denoted ST_L-B. At 5 s, however, the camera has continuously output only 4 frames, which does not satisfy the preset condition (5 continuously output sensor data frames), so no object feature identification is performed on the sensor data frames output by the camera. At 6 s, the sensor data frames continuously output by the camera reach 5 frames, so feature identification is performed on the 5 sensor data frames continuously output by the camera. Assuming that the sensor data frames output by the camera contain detection data of object 1 and object 2, performing object feature identification on the 5 continuously output camera frames yields the sensor track data set of object 1, denoted ST_C-1, and the sensor track data set of object 2, denoted ST_C-2. The multi-sensor object association method provided by this embodiment is used to associate object 1 detected by the camera with object A and object B detected by the radar, so as to determine which of object A and object B is the same object as object 1, and to determine which of object A and object B is the same object as object 2.
Therefore the sensor track data set ST_L-A of object A and the sensor track data set ST_L-B of object B obtained by the radar form the "two saved fusion track data sets" in step 110, denoted fusion track data set FT_A and fusion track data set FT_B respectively, where FT_A contains the sensor track data set ST_L-A of object A from the radar, and FT_B contains the sensor track data set ST_L-B of object B from the radar. The sensor track data set ST_C-1 of object 1 and the sensor track data set ST_C-2 of object 2 obtained by the camera are the two to-be-matched sensor track data sets generated at the current time (6 s).
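A minimal sketch of the frame-buffering behaviour described in the example above, assuming a hypothetical FrameBuffer helper (not named in the patent): a sensor track data set is emitted only once a sensor has accumulated the set number of consecutive frames.

```python
from collections import defaultdict, deque

WINDOW = 5  # set number of consecutive frames per sensor track data set

class FrameBuffer:
    """Buffers frames per sensor and emits per-object feature windows."""
    def __init__(self, window=WINDOW):
        self.window = window
        self.frames = defaultdict(lambda: deque(maxlen=window))

    def push(self, sensor_id, frame):
        """frame: {object_label: feature_dict} for one output of one sensor."""
        self.frames[sensor_id].append(frame)
        if len(self.frames[sensor_id]) < self.window:
            return None  # preset condition not met yet (e.g. the camera at 5 s)
        # one sensor track data set per object seen in every buffered frame
        labels = set.intersection(*(set(f) for f in self.frames[sensor_id]))
        return {label: [f[label] for f in self.frames[sensor_id]]
                for label in labels}

# e.g. pushing 5 radar frames containing "A" and "B" returns
# {"A": [...5 feature dicts...], "B": [...]} on the fifth push, None before that.
```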
Step 120: determine the similarity between the to-be-matched sensor track data sets and the fusion track data sets.
Continuing the example above, this means determining the similarity between each of the to-be-matched sensor track data sets ST_C-1 and ST_C-2 and each of the fusion track data sets FT_A and FT_B. Specifically, the similarity between the to-be-matched sensor track data set ST_C-1 and the fusion track data set FT_A can be determined by computing the similarity between ST_C-1 and each sensor track data set in FT_A; for example, the average of the similarities between ST_C-1 and each sensor track data set in FT_A can be used as the similarity between ST_C-1 and FT_A. The similarity between ST_C-1 and FT_B, between ST_C-2 and FT_A, and between ST_C-2 and FT_B is computed in the same way as the similarity between ST_C-1 and FT_A described above.
Step 130: determine, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets.
Specifically, an association relationship is established between the object corresponding to a to-be-matched sensor track data set and the object corresponding to the fusion track data set given by the maximum value in the similarity determination result. For example, assume the similarity between the to-be-matched sensor track data set ST_C-1 and the fusion track data set FT_A is 0.6, and the similarity between ST_C-2 and FT_B is 0.2, giving a similarity determination result of 0.6 + 0.2 = 0.8; meanwhile, the similarity between ST_C-1 and FT_B is 0.3, and the similarity between ST_C-2 and FT_A is 0.2, giving a similarity determination result of 0.3 + 0.2 = 0.5. Since 0.8 is greater than 0.5, an association relationship is established between object 1 corresponding to ST_C-1 and object A corresponding to FT_A, and between object 2 corresponding to ST_C-2 and object B corresponding to FT_B.
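A small sketch of the assignment step in this example, assuming a brute-force search over permutations (sufficient for the 2x2 case in the text; Embodiment two uses a bipartite graph matching algorithm for the general case): the pairing whose summed similarity is largest is selected.

```python
from itertools import permutations

def best_association(sim):
    """sim[i][j]: similarity of to-be-matched track set i and fusion track set j.
    Assumes no more to-be-matched sets than fusion track data sets."""
    n_rows, n_cols = len(sim), len(sim[0])
    best_total, best_pairs = float("-inf"), None
    for perm in permutations(range(n_cols), n_rows):
        total = sum(sim[i][j] for i, j in enumerate(perm))
        if total > best_total:
            best_total, best_pairs = total, list(enumerate(perm))
    return best_total, best_pairs

# Rows: ST_C-1, ST_C-2; columns: FT_A, FT_B (values from the example above)
sim = [[0.6, 0.3],
       [0.2, 0.2]]
print(best_association(sim))  # -> (0.8, [(0, 0), (1, 1)])
```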
In the technical solution of this embodiment, at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set are obtained, wherein each fusion track data set contains at least one existing sensor track data set; the similarity between the to-be-matched sensor track data sets and the fusion track data sets is determined; and the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets is determined according to the similarity determination result, wherein each sensor track data set contains the feature data of the same object determined respectively from a set number of sensor data frames continuously output by the same sensor, different sensor track data sets correspond to different objects, the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors, and different fusion track data sets correspond to different objects. These technical means improve the association accuracy and association efficiency for the same object observed by different sensors.
Embodiment two
Fig. 2 is a flow diagram of a multi-sensor object association method in Embodiment two of the present invention. On the basis of the technical solution of the above embodiment, this embodiment gives a specific implementation of "Step 120: determine the similarity between the to-be-matched sensor track data sets and the fusion track data sets". Referring to Fig. 2, the object association method comprises the following steps.
Step 210: obtain at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set.
Step 220: for each to-be-matched sensor track data set, determine the similarity between the current to-be-matched sensor track data set and each fusion track data set that satisfies the condition.
Here, the sensors corresponding to the sensor track data sets contained in a fusion track data set that satisfies the condition are different from the sensor corresponding to the current to-be-matched sensor track data set. For example, assume the current to-be-matched sensor track data set is the sensor track data set ST_C-1 for object 1 detected by the camera. Since the sensor corresponding to the sensor track data set ST_L-A contained in the fusion track data set FT_A is the radar, which is different from the camera corresponding to ST_C-1, the fusion track data set FT_A is a fusion track data set satisfying the condition for ST_C-1. Similarly, since the sensor corresponding to the sensor track data set ST_L-B contained in the fusion track data set FT_B is the radar, which is different from the camera corresponding to ST_C-1, the fusion track data set FT_B is also a fusion track data set satisfying the condition for ST_C-1.
Illustratively, determining the similarity between the current to-be-matched sensor track data set and each fusion track data set satisfying the condition comprises:
separately computing the similarity between the current to-be-matched sensor track data set and each sensor track data set contained in any fusion track data set satisfying the condition, and determining, according to the computed similarities, the similarity between the current to-be-matched sensor track data set and that fusion track data set; for example, the computed similarities between the current to-be-matched sensor track data set and each sensor track data set contained in the fusion track data set can be averaged, and the average used as the similarity between the current to-be-matched sensor track data set and that fusion track data set.
Assume the current to-be-matched sensor track data set is the sensor track data set ST_C-1 for object 1 detected by the camera, and that there are two fusion track data sets satisfying the condition, namely the fusion track data set FT_A and the fusion track data set FT_B, where FT_A contains two sensor track data sets, the sensor track data set ST_L1-A for object A detected by a first radar and the sensor track data set ST_L2-A for object A detected by a second radar, and FT_B contains one sensor track data set, the sensor track data set ST_L3-B for object B detected by a third radar. Determining the similarity between the current to-be-matched sensor track data set ST_C-1 and each fusion track data set satisfying the condition (FT_A and FT_B) then comprises:
First step: compute the similarity between the sensor track data set ST_C-1 and the sensor track data set ST_L1-A in the fusion track data set FT_A, denoted S1.
Second step: compute the similarity between ST_C-1 and the sensor track data set ST_L2-A in FT_A, denoted S2.
Third step: compute the average of S1 and S2; this average is the similarity between ST_C-1 and the fusion track data set FT_A.
Fourth step: compute the similarity between ST_C-1 and the sensor track data set ST_L3-B in the fusion track data set FT_B, denoted S3. Since FT_B contains only one sensor track data set, S3 is the similarity between ST_C-1 and FT_B.
The above computation can be summarized by a formula: the similarity between the current to-be-matched sensor track data set and any fusion track data set satisfying the condition is determined according to the following formula:

similar_sensor_track_m = (1/n) * Σ_{i=1..n} similar(S_i, S_m)

where similar_sensor_track_m denotes the similarity between the current to-be-matched sensor track data set and the fusion track data set satisfying the condition, n denotes the number of sensor track data sets in that fusion track data set, S_m denotes the current to-be-matched sensor track data set, S_i denotes the i-th sensor track data set in that fusion track data set, and similar(S_i, S_m) denotes the similarity between the sensor track data set S_i and the sensor track data set S_m.
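A sketch of this set-level similarity under the averaging interpretation above. The function names and the numeric values (0.7, 0.5) are made up for illustration; the pairwise similar() is assumed to be supplied by the caller and is sketched further below.

```python
def similar_sensor_track(fusion_tracks, s_m, similar):
    """Mean pairwise similarity between to-be-matched track set s_m and a
    fusion track data set (given as a list of sensor track data sets)."""
    sims = [similar(s_i, s_m) for s_i in fusion_tracks]
    return sum(sims) / len(sims)

# e.g. with assumed values S1 = 0.7 and S2 = 0.5 for the two tracks in FT_A:
print(similar_sensor_track(["ST_L1-A", "ST_L2-A"], "ST_C-1",
                           lambda a, b: {"ST_L1-A": 0.7, "ST_L2-A": 0.5}[a]))
# -> 0.6
```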
Further, separately computing the similarity between the current to-be-matched sensor track data set and each sensor track data set contained in any fusion track data set satisfying the condition comprises:
aligning, according to timestamp and feature dimension, the feature data in the current to-be-matched sensor track data set and the feature data in any sensor track data set contained in any fusion track data set satisfying the condition;
computing the sub-similarities between the aligned feature data; and
computing, from the sub-similarities, the similarity between the current to-be-matched sensor track data set and that sensor track data set contained in the fusion track data set satisfying the condition.
Here, the feature dimension may specifically be, for example, a velocity dimension or a position dimension.
Specifically, the similarity between the current to-be-matched sensor track data set and any sensor track data set contained in any fusion track data set satisfying the condition is computed from the sub-similarities according to the following formula:

similar(A, α) = (1/m) * Σ_{i=1..m} similar_object(A_i, α_i)

where A denotes the current to-be-matched sensor track data set, α denotes any sensor track data set contained in any fusion track data set satisfying the condition, similar(A, α) denotes the similarity between the sensor track data set A and the sensor track data set α, each sensor track data set contains the feature data of the same object determined from m sensor data frames continuously output by the same sensor, A_i denotes the object contained in the i-th sensor data frame in the sensor track data set A, α_i denotes the object contained in the i-th sensor data frame in the sensor track data set α, and similar_object(A_i, α_i) denotes the similarity between the object A_i and the object α_i;
similar_object(A_i, α_i) is obtained according to the following formula:

similar_object(A_i, α_i) = (1/n) * Σ_{j=1..n} similar_feature(A_ij, α_ij)

where similar_feature(A_ij, α_ij) denotes the sub-similarity between the j-th dimension of the feature data of the object A_i and the j-th dimension of the feature data of the object α_i, and n denotes the total number of dimensions of the object feature data.
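A compact sketch of the two nested averages above, assuming each sensor track data set is represented as a list of per-frame feature dictionaries and that the frames have already been aligned by timestamp (names and data layout are illustrative assumptions):

```python
def similar_object(obj_a, obj_b, similar_feature):
    """Mean sub-similarity over the n feature dimensions of two aligned objects."""
    dims = obj_a.keys()  # e.g. {"velocity": ..., "position": ...}
    return sum(similar_feature(d, obj_a[d], obj_b[d]) for d in dims) / len(dims)

def similar(track_a, track_b, similar_feature):
    """Mean object similarity over the m aligned frames of two track data sets."""
    return sum(similar_object(a_i, b_i, similar_feature)
               for a_i, b_i in zip(track_a, track_b)) / len(track_a)
```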
Further, computing the sub-similarity between the aligned feature data (i.e. the sub-similarity between the j-th dimension of the feature data of the object A_i and of the object α_i mentioned above) comprises:
computing the difference between the aligned feature data, looking up the posterior probability value corresponding to the difference in a pre-established posterior probability mapping table, and taking that posterior probability value as the sub-similarity corresponding to the aligned feature data;
where the posterior probability mapping table stores the mapping relationship between ranges of the difference between the feature data output by different sensors for the same object and posterior probability values.
The mapping relationship stored in the posterior probability mapping table, between the ranges of the difference between the feature data output by different sensors for the same object and the posterior probability values, is obtained by the following test method.
During testing, assume the velocity of object 1 is detected by a first camera and a second camera. In the first detection, the first camera measures the velocity of object 1 as 45 m/s and the second camera measures it as 43 m/s, so the velocity difference between the two measurements is 45 - 43 = 2. In the second detection, the first camera measures the velocity of object 1 as 46 m/s and the second camera measures it as 42 m/s, so the velocity difference is 46 - 42 = 4. Detection is repeated according to this test method; assuming 100 detections are performed in total and the difference of the detection results falls in the range 0-2 in 10 of them, the posterior probability value corresponding to the difference range 0-2 is 10/100 = 0.1. The posterior probability mapping table can be obtained according to this test method.
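A sketch of building and querying such a table from recorded difference samples, under the binning and counting scheme described above. The bin edges, sample values and helper names are assumptions for illustration.

```python
from bisect import bisect_left

def build_posterior_table(differences, bin_edges):
    """Map each difference range to the fraction of samples falling inside it."""
    counts = [0] * (len(bin_edges) + 1)
    for d in differences:
        counts[bisect_left(bin_edges, d)] += 1
    total = len(differences)
    return [c / total for c in counts]

def lookup(table, bin_edges, diff):
    """Posterior probability value used as the sub-similarity for this difference."""
    return table[bisect_left(bin_edges, diff)]

# e.g. 100 recorded velocity differences; bins (-inf, 2], (2, 4], (4, inf)
diffs = [2] * 10 + [4] * 60 + [6] * 30
table = build_posterior_table(diffs, bin_edges=[2, 4])
print(lookup(table, [2, 4], 1.5))  # -> 0.1, as in the example above
```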
Step 230: determine, according to the determined similarities and using a bipartite graph matching algorithm, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets.
Here, when the similarity between a to-be-matched sensor track data set and a fusion track data set is less than a set threshold, that to-be-matched sensor track data set and that fusion track data set no longer participate in the operation of determining, based on the bipartite graph matching algorithm, the association relationship between the object corresponding to the to-be-matched sensor track data set and the object corresponding to the fusion track data set. If the similarities between a to-be-matched sensor track data set and all fusion track data sets satisfying the condition are all less than the set threshold, it is determined that there is no fusion track data set having an association relationship with that to-be-matched sensor track data set.
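A sketch of threshold-gated bipartite matching. The use of SciPy's Hungarian solver (scipy.optimize.linear_sum_assignment) is an assumption for illustration; the patent only states that a bipartite graph matching algorithm is used, without naming one.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_with_threshold(sim, threshold):
    """Bipartite matching that ignores pairs whose similarity is below threshold."""
    sim = np.asarray(sim, dtype=float)
    gated = np.where(sim >= threshold, sim, -1e9)  # exclude sub-threshold pairs
    rows, cols = linear_sum_assignment(gated, maximize=True)
    # keep only assignments whose similarity actually reached the threshold
    return [(int(r), int(c)) for r, c in zip(rows, cols) if sim[r, c] >= threshold]

sim = [[0.6, 0.3],
       [0.05, 0.2]]
print(match_with_threshold(sim, threshold=0.1))  # -> [(0, 0), (1, 1)]
```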
In the technical solution of this embodiment, on the basis of the technical solution of the above embodiment, the similarity between the current to-be-matched sensor track data set and each sensor track data set contained in any fusion track data set satisfying the condition is computed separately, and the similarity between the current to-be-matched sensor track data set and that fusion track data set is determined from the computed similarities. These technical means improve the computation accuracy of the similarity between the current to-be-matched sensor track data set and any fusion track data set satisfying the condition, and thus improve the association accuracy and association efficiency for the same object observed by different sensors.
Embodiment three
Fig. 3 is a flow diagram of a multi-sensor object association method in Embodiment three of the present invention. On the basis of the technical solutions of the above embodiments, this embodiment further optimizes the object association method. Referring to Fig. 3, the object association method comprises the following steps.
Step 310: obtain at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set.
Here, each sensor track data set contains the feature data of the same object, determined respectively from a set number of sensor data frames continuously output by the same sensor; different sensor track data sets correspond to different objects; the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors; and different fusion track data sets correspond to different objects.
Step 320: determine the similarity between the to-be-matched sensor track data sets and the fusion track data sets.
Step 330: determine, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets.
Step 340: add the to-be-matched sensor track data set to the fusion track data set having an association relationship with that to-be-matched sensor track data set; or,
when there is no fusion track data set having an association relationship with the to-be-matched sensor track data set, generate a new fusion track data set based on the to-be-matched sensor track data set.
Assume the to-be-matched sensor track data set is the sensor track data set ST_C-1 for object 1 obtained by the camera, and the fusion track data set having an association relationship with it is FT_A, where FT_A contains the sensor track data set ST_L1-A for object A detected by the first radar and the sensor track data set ST_L2-A for object A detected by the second radar. The sensor track data set ST_C-1 is then added to the fusion track data set FT_A, so that FT_A contains three sensor track data sets, namely ST_C-1, ST_L1-A and ST_L2-A; according to the object association result, object 1 and object A are the same object.
Step 350: when a new sensor data frame is obtained, update the corresponding fusion track data set using the new sensor data frame.
Here, a new sensor data frame is a sensor data frame obtained by a given sensor after a sensor track data set has already been obtained from the 5 frames (or another set number; this embodiment uses 5 frames for illustration) continuously output by that sensor. For example, in the example above, at 5 s the radar has accumulated 5 continuously output sensor data frames, and at that time the sensor track data set ST_L-A of object A and the sensor track data set ST_L-B of object B are obtained, whereas the sensor data frames continuously output by the camera only reach 5 frames at 6 s. Therefore the sensor track data set ST_L-A of object A and the sensor track data set ST_L-B of object B obtained by the radar respectively constitute the two saved fusion track data sets, denoted FT_A and FT_B, and the sensor track data set ST_C-1 of object 1 and the sensor track data set ST_C-2 of object 2, obtained from the 5 camera frames accumulated at 6 s, are the two to-be-matched sensor track data sets. Meanwhile, at 6 s the radar outputs another sensor data frame; this frame is the new sensor data frame mentioned above, and the fusion track data sets FT_A and FT_B are updated using this new sensor data frame.
Specifically: when a new sensor data frame is obtained, the feature data of the object is determined from the new sensor data frame. For each saved fusion track data set, it is judged whether the determined object is the same as the object corresponding to the current fusion track data set and whether the sensor corresponding to the new sensor data frame is the same as a sensor corresponding to the current fusion track data set; if so, the current fusion track data set is updated according to the feature data of the determined object.
Further, updating the current fusion track data set according to the feature data of the determined object comprises:
using the feature data of the determined object to replace, in the sensor track data set in the current fusion track data set corresponding to the sensor that produced this feature data, the feature data farthest from the current time.
For example, the feature data of object A determined from the sensor data frame output by the radar at 6 s replaces, in the sensor track data set ST_L-A, the feature data of object A from the sensor data frame output by the radar at 1 s. The feature data may specifically be, for example, the velocity or position of the object.
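A minimal sketch of this sliding-window update, assuming the frames inside a sensor track data set are stored as timestamped feature dictionaries (the representation is an assumption for illustration):

```python
def update_sensor_track(frames, new_feature):
    """Replace the oldest buffered feature data with that from the new frame."""
    oldest = min(range(len(frames)), key=lambda i: frames[i]["t"])
    frames[oldest] = new_feature
    return frames

track_A = [{"t": t, "velocity": 40 + t} for t in (1, 2, 3, 4, 5)]  # radar, 1 s - 5 s
update_sensor_track(track_A, {"t": 6, "velocity": 47})  # 6 s frame replaces 1 s frame
print(sorted(f["t"] for f in track_A))  # -> [2, 3, 4, 5, 6]
```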
Step 360: determine whether the updated fusion track data set is valid; if valid, execute step 370 and retain the updated fusion track data set; if invalid, execute step 380 and perform dismantling processing on the updated fusion track data set, so that the dismantled fusion track data set becomes valid or the updated fusion track data set is deleted.
Further, determining whether the updated fusion track data set is valid comprises:
if the updated fusion track data set contains one sensor track data set, determining the last update time of the sensor track data set contained in the updated fusion track data set, and determining whether the updated fusion track data set is valid according to the last update time; if the last update time is too far from the current time, determining that the updated fusion track data set is invalid;
if the updated fusion track data set contains multiple sensor track data sets, inputting each sensor track data set in the updated fusion track data set into a pre-trained neural network model, and obtaining, as output by the neural network model, the probability value that the sensor track data sets in the updated fusion track data set correspond to the same object; if the probability value is greater than a set threshold, determining that the updated fusion track data set is valid; if the probability value is less than the set threshold, computing the total similarity of the sensor track data sets in the updated fusion track data set, and determining whether the updated fusion track data set is valid according to the total similarity.
Here, the pre-trained neural network model performs learning and training based on a set quantity of training data. Each group of training data contains at least two sensor track data sets, and each sensor track data set carries an object label: sensor track data sets corresponding to the same object carry the same object label, and sensor track data sets corresponding to different objects carry different object labels. During training, the neural network model performs feature learning based on sensor track data sets known to correspond to the same object or known to correspond to different objects, memorizing the features of each sensor track data set so as to recognize which sensor track data sets correspond to the same object and which correspond to different objects; by counting the object labels carried by the sensor track data sets, the probability that the at least two sensor track data sets contained in each group of training data correspond to the same object can be computed. Specifically, the total similarity of the sensor track data sets in the updated fusion track data set is computed according to the following formula:

similarity = (1 / C(n, 2)) * Σ_{1 ≤ i < j ≤ n} similar(S_i, S_j)

where similarity denotes the total similarity, n denotes the number of sensor track data sets in the updated fusion track data set, C(n, 2) is the combination formula, denoting the total number of combinations of any two of the n sensor track data sets, S_i denotes the i-th sensor track data set in the updated fusion track data set, S_j denotes the j-th sensor track data set in the updated fusion track data set, and similar(S_i, S_j) denotes the similarity between the sensor track data set S_i and the sensor track data set S_j.
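A sketch of this pairwise total similarity over a fusion track data set, assuming a pairwise similar() like the one sketched earlier; the track identifiers and similarity values below are made up for illustration.

```python
from itertools import combinations
from math import comb

def total_similarity(sensor_tracks, similar):
    """Mean pairwise similarity over all C(n, 2) pairs in a fusion track data set."""
    n = len(sensor_tracks)
    pair_sims = [similar(a, b) for a, b in combinations(sensor_tracks, 2)]
    return sum(pair_sims) / comb(n, 2)

# e.g. three track sets with assumed pairwise similarities
sims = {("A", "B"): 0.9, ("A", "C"): 0.8, ("B", "C"): 0.7}
print(total_similarity(["A", "B", "C"], lambda a, b: sims[(a, b)]))  # -> 0.8
```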
Further, performing dismantling processing on the updated fusion track data set comprises:
if the updated fusion track data set contains one sensor track data set, deleting the updated fusion track data set;
if the updated fusion track data set contains two sensor track data sets, deleting one of the sensor track data sets in the updated fusion track data set according to a set rule, and continuing to judge whether the fusion track data set from which the sensor track data set has been deleted is valid; the set rule may be, for example, deleting the sensor track data set corresponding to the sensor with the lowest priority, or the sensor track data set whose update time is farther from the current time;
if the updated fusion track data set contains at least three sensor track data sets, separately computing, for each of the at least three sensor track data sets, the total similarity between that sensor track data set and the other sensor track data sets, deleting from the updated fusion track data set the sensor track data set corresponding to the smallest total similarity, and continuing, according to the above method, to judge whether the fusion track data set from which the sensor track data set has been deleted is valid.
Specifically, the total similarity between each of the at least three sensor track data sets and the other sensor track data sets is computed according to the following formula:

similar_sensor_track_m = Σ_{i=1..n, i≠m} similar(S_i, S_m)

where similar_sensor_track_m denotes the total similarity, n denotes the number of sensor track data sets contained in the updated fusion track data set, S_i denotes the i-th sensor track data set in the updated fusion track data set, S_m denotes the sensor track data set currently being evaluated, similar(S_i, S_m) denotes the similarity between the sensor track data set S_i and the sensor track data set S_m, and i ≠ m.
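A sketch of the dismantling logic described above, under assumed helper names: is_valid, the pairwise similar(), the "last_update" field and the two-track tie-break rule are placeholders chosen for illustration; the patent leaves the exact validity test and set rule to the embodiments above.

```python
def dismantle(fusion_tracks, similar, is_valid):
    """Remove sensor track data sets until the fusion track data set is valid or empty."""
    while fusion_tracks and not is_valid(fusion_tracks):
        if len(fusion_tracks) == 1:
            return []                      # delete the whole fusion track data set
        if len(fusion_tracks) == 2:
            # set rule (assumed here): drop the track whose last update is older
            idx = min((0, 1), key=lambda i: fusion_tracks[i]["last_update"])
        else:
            # drop the track with the smallest total similarity to the others
            def total(m):
                return sum(similar(fusion_tracks[i], fusion_tracks[m])
                           for i in range(len(fusion_tracks)) if i != m)
            idx = min(range(len(fusion_tracks)), key=total)
        fusion_tracks.pop(idx)
    return fusion_tracks
```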
Step 370: retain the updated fusion track data set.
Step 380: perform dismantling processing on the updated fusion track data set, so that the dismantled fusion track data set becomes valid or the updated fusion track data set is deleted.
On the basis of the technical solutions of the above embodiments, the technical solution of this embodiment updates the corresponding fusion track data set using a new sensor data frame when the new sensor data frame is obtained, and further ensures the validity of the updated fusion track data set, which improves the association accuracy and association efficiency for the same object observed by different sensors.
Embodiment four
Fig. 4 is a structural schematic diagram of a multi-sensor object association apparatus in Embodiment four of the present invention. Referring to Fig. 4, the object association apparatus comprises an obtaining module 410, a similarity determination module 420 and an association relationship determination module 430.
The obtaining module 410 is configured to obtain at least one to-be-matched sensor track data set that is currently generated and at least one saved fusion track data set, wherein each fusion track data set contains at least one existing sensor track data set.
The similarity determination module 420 is configured to determine the similarity between the to-be-matched sensor track data sets and the fusion track data sets.
The association relationship determination module 430 is configured to determine, according to the similarity determination result, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets.
Here, each sensor track data set contains the feature data of the same object, determined respectively from a set number of sensor data frames continuously output by the same sensor; different sensor track data sets correspond to different objects; the sensor track data sets contained in the same fusion track data set correspond to the same object but to different sensors; and different fusion track data sets correspond to different objects.
Further, the similarity determination module 420 is specifically configured to: for each to-be-matched sensor track data set, separately determine the similarity between the current to-be-matched sensor track data set and each fusion track data set satisfying the condition, wherein the sensors corresponding to the sensor track data sets contained in a fusion track data set satisfying the condition are different from the sensor corresponding to the current to-be-matched sensor track data set.
Correspondingly, the association relationship determination module 430 is specifically configured to: determine, according to the determined similarities and using a bipartite graph matching algorithm, the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets.
Further, the similarity determination module 420 is specifically configured to: separately compute the similarity between the current to-be-matched sensor track data set and each sensor track data set contained in any fusion track data set satisfying the condition, and determine, according to the computed similarities, the similarity between the current to-be-matched sensor track data set and that fusion track data set.
Further, the similarity determination module 420 is specifically configured to: align, according to timestamp and feature dimension, the feature data in the current to-be-matched sensor track data set and the feature data in any sensor track data set contained in any fusion track data set satisfying the condition;
compute the sub-similarities between the aligned feature data; and
compute, from the sub-similarities, the similarity between the current to-be-matched sensor track data set and that sensor track data set contained in the fusion track data set satisfying the condition.
Further, the similarity determination module 420 is specifically configured to: compute the difference between the aligned feature data, look up the posterior probability value corresponding to the difference in a pre-established posterior probability mapping table, and take that posterior probability value as the sub-similarity corresponding to the aligned feature data;
where the posterior probability mapping table stores the mapping relationship between ranges of the difference between the feature data output by different sensors for the same object and posterior probability values.
Further, the apparatus further comprises a processing module, configured to, after the association relationship between the objects corresponding to the to-be-matched sensor track data sets and the objects corresponding to the fusion track data sets has been determined, add the to-be-matched sensor track data set to the fusion track data set having an association relationship with that to-be-matched sensor track data set, or, when there is no fusion track data set having an association relationship with the to-be-matched sensor track data set, generate a new fusion track data set based on the to-be-matched sensor track data set.
Further, the apparatus further comprises an update module, configured to, when a new sensor data frame is obtained, determine the feature data of the object according to the new sensor data frame, and, for each saved fusion track data set, judge whether the determined object is the same as the object corresponding to the current fusion track data set and whether the sensor corresponding to the new sensor data frame is the same as a sensor corresponding to the current fusion track data set; if so, update the current fusion track data set according to the feature data of the determined object.
Further, the update module is specifically configured to: use the feature data of the determined object to replace, in the sensor track data set in the current fusion track data set corresponding to the sensor that produced this feature data, the feature data farthest from the current time.
Further, the apparatus further comprises a validity determination module, configured to, after the current fusion track data set has been updated, determine whether the updated fusion track data set is valid; if valid, retain the updated fusion track data set; if invalid, perform dismantling processing on the updated fusion track data set, so that the dismantled fusion track data set becomes valid or the updated fusion track data set is deleted.
Further, the validity determination module is specifically configured to:
if the updated fusion track data set contains one sensor track data set, determine the last update time of the sensor track data set contained in the updated fusion track data set, and determine whether the updated fusion track data set is valid according to the last update time;
if the updated fusion track data set contains multiple sensor track data sets, input each sensor track data set in the updated fusion track data set into a pre-trained neural network model, and obtain, as output by the neural network model, the probability value that the sensor track data sets in the updated fusion track data set correspond to the same object; if the probability value is greater than a set threshold, determine that the updated fusion track data set is valid; if the probability value is less than the set threshold, compute the total similarity of the sensor track data sets in the updated fusion track data set, and determine whether the updated fusion track data set is valid according to the total similarity.
Further, the validity determination module includes a dismantling unit, and the dismantling unit is specifically configured to:
if the updated fusion track data set includes one sensor track data set, delete the updated fusion track data set;
if the updated fusion track data set includes two sensor track data sets, delete, according to a set deletion rule, one of the sensor track data sets in the updated fusion track data set, and continue to judge whether the fusion track data set from which that sensor track data set has been deleted is valid;
if the updated fusion track data set includes at least three sensor track data sets, respectively calculate, for each of the at least three sensor track data sets, the total similarity between that sensor track data set and the other sensor track data sets, delete from the updated fusion track data set the sensor track data set with the minimum total similarity, and continue to judge whether the fusion track data set from which that sensor track data set has been deleted is valid.
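A matching sketch of the dismantling logic is given below. Here total_similarity(track, members) and is_valid(fusion_set) stand in for the logic described above, saved_sets is the collection of saved fusion track data sets, and dropping the most recently added member in the two-member case is an assumed deletion rule, since the embodiment does not spell that rule out.

    def dismantle(fusion_set, saved_sets, total_similarity, is_valid):
        """Dismantling processing for an invalid updated fusion track data set."""
        members = fusion_set["tracks"]
        if len(members) == 1:
            saved_sets.remove(fusion_set)        # drop the whole fusion track data set
            return
        if len(members) == 2:
            members.pop()                        # assumed rule: drop the most recently added member
        else:
            # drop the member whose total similarity to the others is lowest
            worst = min(members, key=lambda t: total_similarity(t, members))
            members.remove(worst)
        if not is_valid(fusion_set):             # keep dismantling until the set is valid
            dismantle(fusion_set, saved_sets, total_similarity, is_valid)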
The technical solution of this embodiment improves the association accuracy and the association efficiency for the same object observed by different sensors.
The multi-sensor object association apparatus provided by the embodiment of the present invention can execute the multi-sensor object association method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
Embodiment five
Fig. 5 is a schematic structural diagram of an electronic device provided by Embodiment Five of the present invention. Fig. 5 shows a block diagram of an exemplary device 12 suitable for implementing embodiments of the present invention. The device 12 shown in Fig. 5 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present invention.
As shown in Fig. 5, the device 12 takes the form of a general-purpose computing device. The components of the device 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 that connects the different system components (including the system memory 28 and the processing unit 16).
The bus 18 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The device 12 typically includes a variety of computer-system-readable media. These media may be any available media that can be accessed by the device 12, including volatile and non-volatile media as well as removable and non-removable media.
The system memory 28 may include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 30 and/or a cache memory 32. The device 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, a storage system 34 may be used to read from and write to non-removable, non-volatile magnetic media (not shown in Fig. 5, commonly referred to as a "hard disk drive"). Although not shown in Fig. 5, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from and writing to a removable non-volatile optical disk (e.g., a CD-ROM, a DVD-ROM or other optical media) may also be provided. In these cases, each drive may be connected to the bus 18 through one or more data media interfaces. The memory 28 may include at least one program product having a set of program modules (such as the acquisition module 410, the similarity determination module 420 and the association relationship determination module 430) configured to perform the functions of the embodiments of the present invention.
A program/utility 40 having such a set of program modules 42 (the acquisition module 410, the similarity determination module 420 and the association relationship determination module 430) may be stored, for example, in the memory 28. The program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules and program data, and each of these examples, or some combination thereof, may include an implementation of a network environment. The program modules 42 generally perform the functions and/or methods of the embodiments described in the present invention.
The device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a pointing device, a display 24), with one or more devices that enable a user to interact with the device 12, and/or with any device (e.g., a network card or a modem) that enables the device 12 to communicate with one or more other computing devices. Such communication may take place through input/output (I/O) interfaces 22. Moreover, the device 12 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 20. As shown, the network adapter 20 communicates with the other modules of the device 12 through the bus 18. It should be understood that, although not shown in the drawings, other hardware and/or software modules may be used in conjunction with the device 12, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
By running the programs stored in the system memory 28, the processing unit 16 executes various functional applications and data processing, for example implementing the multi-sensor object association method provided by the embodiments of the present invention.
Embodiment six
Embodiment Six of the present invention further provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the multi-sensor object association method described in any embodiment of the present invention.
The computer storage medium of the embodiments of the present invention may use any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by, or in connection with, an instruction execution system, apparatus or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate or transmit a program for use by, or in connection with, an instruction execution system, apparatus or device.
The program code contained on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical cable, RF, or any suitable combination of the above.
Computer program code for carrying out the operations of the present invention may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).

Claims (14)

1. A multi-sensor object association method, comprising:
obtaining at least one currently generated sensor track data set to be matched and at least one saved fusion track data set, wherein each fusion track data set includes at least one existing sensor track data set;
determining the similarity between the sensor track data set to be matched and the fusion track data set; and
determining, according to the similarity determination result, the association relationship between the object corresponding to the sensor track data set to be matched and the object corresponding to the fusion track data set;
wherein each sensor track data set includes feature data of a same object respectively determined from a set number of sensor data frames continuously output by a same sensor; the objects corresponding to different sensor track data sets are different; the objects corresponding to the sensor track data sets included in a same fusion track data set are identical while the corresponding sensors are different; and the objects corresponding to different fusion track data sets are different.
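Purely as an illustration, the structural constraints recited in claim 1 can be pictured with the following Python sketch; the class and field names are invented for readability and are not part of the claim.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class SensorTrackSet:
        sensor_id: str                 # exactly one sensor per sensor track data set
        local_object_id: int           # the object this sensor believes it is tracking
        features: List[Dict] = field(default_factory=list)  # one entry per data frame

    @dataclass
    class FusionTrackSet:
        members: List[SensorTrackSet] = field(default_factory=list)

        def can_accept(self, candidate: "SensorTrackSet") -> bool:
            # the sensors of the member sets must be pairwise different; whether the
            # candidate describes the same physical object is decided by the
            # similarity-based association, not by comparing identifiers
            return all(m.sensor_id != candidate.sensor_id for m in self.members)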
2. The object association method according to claim 1, wherein determining the similarity between the sensor track data set to be matched and the fusion track data set comprises:
for each sensor track data set to be matched, respectively determining the similarity between the current sensor track data set to be matched and each fusion track data set that satisfies a condition, the condition being that the sensor corresponding to every sensor track data set included in the fusion track data set is different from the sensor corresponding to the current sensor track data set to be matched;
correspondingly, determining, according to the similarity determination result, the association relationship between the object corresponding to the sensor track data set to be matched and the object corresponding to the fusion track data set comprises:
determining, according to the determined similarities and by using a bipartite graph matching algorithm, the association relationship between the object corresponding to the sensor track data set to be matched and the object corresponding to the fusion track data set.
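Claim 2 does not name a particular bipartite graph matching algorithm. One common realization is the Hungarian method, sketched below with scipy.optimize.linear_sum_assignment applied to the similarity matrix; the acceptance threshold accept is an added assumption, not part of the claim.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def associate(similarity, accept=0.5):
        """similarity[i][j]: similarity between to-be-matched sensor track set i
        and eligible fusion track set j. Returns the accepted (i, j) pairs."""
        sim = np.asarray(similarity, dtype=float)
        rows, cols = linear_sum_assignment(-sim)   # maximise the total similarity
        return [(i, j) for i, j in zip(rows, cols) if sim[i, j] >= accept]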
3. The object association method according to claim 2, wherein determining the similarity between the current sensor track data set to be matched and each fusion track data set that satisfies the condition comprises:
respectively calculating the similarity between the current sensor track data set to be matched and each sensor track data set included in any fusion track data set that satisfies the condition, and determining, according to the calculated similarities, the similarity between the current sensor track data set to be matched and that fusion track data set.
4. The object association method according to claim 3, wherein respectively calculating the similarity between the current sensor track data set to be matched and each sensor track data set included in any fusion track data set that satisfies the condition comprises:
aligning, according to timestamp and feature dimension, the feature data in the current sensor track data set to be matched with the feature data in any sensor track data set included in any fusion track data set that satisfies the condition;
calculating sub-similarities between the aligned feature data; and
calculating, according to the sub-similarities, the similarity between the current sensor track data set to be matched and the sensor track data set included in the fusion track data set that satisfies the condition.
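One way the alignment and aggregation of claim 4 could be realized is sketched below. Nearest-timestamp pairing within a tolerance max_dt and averaging of the sub-similarities are assumptions; the claim only fixes that alignment is by timestamp and feature dimension and that the set-level similarity is derived from the sub-similarities.

    def set_similarity(track_a, track_b, sub_similarity, max_dt=0.05):
        """Align the feature data of two sensor track sets by timestamp, compare
        them dimension by dimension, and aggregate the sub-similarities."""
        if not track_a["features"] or not track_b["features"]:
            return 0.0
        subs = []
        for fa in track_a["features"]:
            # nearest-in-time feature entry from the other set
            fb = min(track_b["features"], key=lambda f: abs(f["ts"] - fa["ts"]))
            if abs(fb["ts"] - fa["ts"]) <= max_dt:
                for dim in fa["data"].keys() & fb["data"].keys():  # shared dimensions
                    subs.append(sub_similarity(dim, fa["data"][dim], fb["data"][dim]))
        return sum(subs) / len(subs) if subs else 0.0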
5. The object association method according to claim 4, wherein calculating the sub-similarities between the aligned feature data comprises:
calculating the difference between the aligned feature data, looking up, in a pre-established posterior probability mapping table, the posterior probability value corresponding to the difference, and determining the posterior probability value as the sub-similarity corresponding to the aligned feature data;
wherein the posterior probability mapping table stores mapping relationships between posterior probability values and ranges of the difference between feature data of a same object output by different sensors.
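A posterior probability mapping table of the kind recited in claim 5 could, for example, be held as per-dimension difference ranges with an associated posterior value; the dimension names and numbers below are invented placeholders, and such a function could serve as the sub_similarity callable assumed in the previous sketch.

    import bisect

    # Upper bounds of the difference ranges and the posterior probability of the
    # same object for each range; a real table would be estimated offline from
    # labelled detections of the sensors involved.
    POSTERIOR_TABLE = {
        "position": ([0.5, 1.0, 2.0], [0.95, 0.80, 0.40, 0.05]),
        "velocity": ([0.3, 1.0],      [0.90, 0.60, 0.10]),
    }

    def sub_similarity(dim, value_a, value_b):
        """Map the difference between two aligned feature values to the posterior
        probability that they belong to the same object; the posterior is used
        directly as the sub-similarity."""
        bounds, posteriors = POSTERIOR_TABLE[dim]
        diff = abs(value_a - value_b)
        return posteriors[bisect.bisect_left(bounds, diff)]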
6. The object association method according to claim 1, wherein, after determining the association relationship between the object corresponding to the sensor track data set to be matched and the object corresponding to the fusion track data set, the method further comprises:
adding the sensor track data set to be matched to the fusion track data set that has an association relationship with the sensor track data set to be matched; or,
when no fusion track data set has an association relationship with the sensor track data set to be matched, generating a new fusion track data set based on the sensor track data set to be matched.
7. The object association method according to claim 1, further comprising:
when a new sensor data frame is obtained, determining the feature data of an object according to the new sensor data frame; and
for each saved fusion track data set, judging whether the determined object is the same as the object corresponding to the current fusion track data set and whether the sensor corresponding to the new sensor data frame is the same as the sensor corresponding to the current fusion track data set, and if so, updating the current fusion track data set according to the determined feature data of the object.
8. The object association method according to claim 7, wherein updating the current fusion track data set according to the determined feature data of the object comprises:
using the determined feature data of the object to replace, in the sensor track data set of the current fusion track data set that corresponds to the sensor associated with that feature data, the feature data farthest from the current time.
9. The object association method according to claim 7, wherein, after the current fusion track data set has been updated, the method further comprises:
determining whether the updated fusion track data set is valid; if valid, retaining the updated fusion track data set; if invalid, performing dismantling processing on the updated fusion track data set so that the dismantled fusion track data set becomes valid or the updated fusion track data set is deleted.
10. The object association method according to claim 9, wherein determining whether the updated fusion track data set is valid comprises:
if the updated fusion track data set includes one sensor track data set, determining the last update time of the sensor track data set included in the updated fusion track data set, and determining, according to the last update time, whether the updated fusion track data set is valid; and
if the updated fusion track data set includes a plurality of sensor track data sets, inputting each sensor track data set of the updated fusion track data set into a pre-trained neural network model to obtain, as the model output, the probability value that the sensor track data sets in the updated fusion track data set correspond to the same object; if the probability value is greater than a set threshold, determining that the updated fusion track data set is valid; and if the probability value is less than the set threshold, calculating the total similarity of each sensor track data set in the updated fusion track data set and determining, according to the total similarities, whether the updated fusion track data set is valid.
11. The object association method according to claim 9, wherein performing dismantling processing on the updated fusion track data set comprises:
if the updated fusion track data set includes one sensor track data set, deleting the updated fusion track data set;
if the updated fusion track data set includes two sensor track data sets, deleting, according to a set deletion rule, one of the sensor track data sets in the updated fusion track data set, and continuing to judge whether the fusion track data set from which that sensor track data set has been deleted is valid; and
if the updated fusion track data set includes at least three sensor track data sets, respectively calculating, for each of the at least three sensor track data sets, the total similarity between that sensor track data set and the other sensor track data sets, deleting from the updated fusion track data set the sensor track data set with the minimum total similarity, and continuing to judge whether the fusion track data set from which that sensor track data set has been deleted is valid.
12. A multi-sensor object association apparatus, comprising:
an acquisition module configured to obtain at least one currently generated sensor track data set to be matched and at least one saved fusion track data set, wherein each fusion track data set includes at least one existing sensor track data set;
a similarity determination module configured to determine the similarity between the sensor track data set to be matched and the fusion track data set; and
an association relationship determination module configured to determine, according to the similarity determination result, the association relationship between the object corresponding to the sensor track data set to be matched and the object corresponding to the fusion track data set;
wherein each sensor track data set includes feature data of a same object respectively determined from a set number of sensor data frames continuously output by a same sensor; the objects corresponding to different sensor track data sets are different; the objects corresponding to the sensor track data sets included in a same fusion track data set are identical while the corresponding sensors are different; and the objects corresponding to different fusion track data sets are different.
13. An electronic device, comprising:
one or more processors; and
a storage device configured to store one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the multi-sensor object association method according to any one of claims 1-11.
14. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the multi-sensor object association method according to any one of claims 1-11.
CN201811481180.4A 2018-12-05 2018-12-05 A kind of object correlating method, device, equipment and the medium of multisensor Pending CN109583505A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811481180.4A CN109583505A (en) 2018-12-05 2018-12-05 A kind of object correlating method, device, equipment and the medium of multisensor

Publications (1)

Publication Number Publication Date
CN109583505A true CN109583505A (en) 2019-04-05

Family

ID=65926047

Country Status (1)

Country Link
CN (1) CN109583505A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140056488A1 (en) * 2006-08-10 2014-02-27 Louisiana Tech University Foundation, Inc. Method and apparatus to relate biometric samples to target far and frr with predetermined confidence levels
CN101839973A (en) * 2010-04-07 2010-09-22 中国人民解放军理工大学 Track correlation method by taking topological sequences as characteristics
CN102609323A (en) * 2012-02-22 2012-07-25 康佳集团股份有限公司 Method for adjusting sensitivity of trackball
CN103150156A (en) * 2012-12-06 2013-06-12 江苏省公用信息有限公司 Method and system, based on geographic model and moving track, for obtaining characteristic crowd in real time
CN103955947A (en) * 2014-03-21 2014-07-30 南京邮电大学 Multi-target association tracking method based on continuous maximum energy and apparent model
CN104915970A (en) * 2015-06-12 2015-09-16 南京邮电大学 Multi-target tracking method based on track association
CN105678804A (en) * 2016-01-06 2016-06-15 北京理工大学 Real-time on-line multi-target tracking method by coupling target detection and data association
CN106022239A (en) * 2016-05-13 2016-10-12 电子科技大学 Multi-target tracking method based on recurrent neural network
CN107102295A (en) * 2017-04-13 2017-08-29 杭州电子科技大学 The multisensor TDOA passive location methods filtered based on GLMB
CN108762245A (en) * 2018-03-20 2018-11-06 华为技术有限公司 Data fusion method and relevant device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Feng Lin et al., "Graph Theory and Its Applications" (《图论及应用》), 31 March 2012 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114879695A (en) * 2019-05-15 2022-08-09 百度在线网络技术(北京)有限公司 Track matching method, device, equipment and medium
CN110134126A (en) * 2019-05-15 2019-08-16 百度在线网络技术(北京)有限公司 Path matching method, apparatus, equipment and medium
CN110134126B (en) * 2019-05-15 2022-08-23 百度在线网络技术(北京)有限公司 Track matching method, device, equipment and medium
CN110132290A (en) * 2019-05-20 2019-08-16 北京百度网讯科技有限公司 Perception information method for amalgamation processing, device, equipment and storage medium
CN110703732A (en) * 2019-10-21 2020-01-17 北京百度网讯科技有限公司 Correlation detection method, device, equipment and computer readable storage medium
CN112904331A (en) * 2019-11-19 2021-06-04 杭州海康威视数字技术股份有限公司 Method, device and equipment for determining movement track and storage medium
CN112904331B (en) * 2019-11-19 2024-05-07 杭州海康威视数字技术股份有限公司 Method, device, equipment and storage medium for determining moving track
CN113095345A (en) * 2020-01-08 2021-07-09 富士通株式会社 Data matching method and device and data processing equipment
CN111651437A (en) * 2020-04-17 2020-09-11 北京嘀嘀无限科技发展有限公司 Data cleaning method and device, electronic equipment and storage medium
CN113298141B (en) * 2021-05-24 2023-09-15 北京环境特性研究所 Detection method, device and storage medium based on multi-source information fusion
CN113298141A (en) * 2021-05-24 2021-08-24 北京环境特性研究所 Detection method and device based on multi-source information fusion and storage medium
CN113821873A (en) * 2021-08-31 2021-12-21 重庆长安汽车股份有限公司 Target association verification method for automatic driving and storage medium
WO2024007972A1 (en) * 2022-07-05 2024-01-11 安徽蔚来智驾科技有限公司 Object association method, computer device, computer readable storage medium, and vehicle

Similar Documents

Publication Publication Date Title
CN109583505A (en) A kind of object correlating method, device, equipment and the medium of multisensor
González et al. Automatic traffic signs and panels inspection system using computer vision
KR102205096B1 (en) Transaction risk detection method and apparatus
CN105444770B (en) Track level map generation and localization method based on smart mobile phone
CN110210302A (en) Multi-object tracking method, device, computer equipment and storage medium
Yin et al. Dynam-SLAM: An accurate, robust stereo visual-inertial SLAM method in dynamic environments
CN107687850A (en) A kind of unmanned vehicle position and orientation estimation method of view-based access control model and Inertial Measurement Unit
CN108090423A (en) A kind of depth detection method of license plate returned based on thermodynamic chart and key point
US10645668B2 (en) Indoor positioning system and method based on geomagnetic signals in combination with computer vision
US20220058818A1 (en) Object-centric three-dimensional auto labeling of point cloud data
Bertoni et al. Perceiving humans: from monocular 3d localization to social distancing
CA3083430C (en) Urban environment labelling
CN110008789A (en) Multiclass object detection and knowledge method for distinguishing, equipment and computer readable storage medium
CN110389995A (en) Lane information detection method, device, equipment and medium
CN111160291A (en) Human eye detection method based on depth information and CNN
CN113762044A (en) Road recognition method, road recognition device, computer equipment and storage medium
CN112200956B (en) Access control method, system, electronic device and storage medium
Li et al. Visual slam in dynamic scenes based on object tracking and static points detection
CN116310799A (en) Dynamic feature point eliminating method combining semantic information and geometric constraint
US9301722B1 (en) Guiding computational perception through a shared auditory space
CN101964054A (en) Friendly track detection system based on visual processing
Zhang et al. Opensight: A simple open-vocabulary framework for lidar-based object detection
CN117593792A (en) Abnormal gesture detection method and device based on video frame
Lin et al. DPL-SLAM: Enhancing Dynamic Point-Line SLAM through Dense Semantic Methods
Zhao et al. A multi-sensor fusion system for improving indoor mobility of the visually impaired

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190405)