CN113408625B - Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system


Info

Publication number
CN113408625B
CN113408625B (application number CN202110690846.2A)
Authority
CN
China
Prior art keywords
data
fusion
source
dimension
heterogeneous
Prior art date
Legal status
Active
Application number
CN202110690846.2A
Other languages
Chinese (zh)
Other versions
CN113408625A (en)
Inventor
龙宁波
谢天
朱世强
李月华
Current Assignee
Zhejiang Lab
Original Assignee
Zhejiang Lab
Priority date
Filing date
Publication date
Application filed by Zhejiang Lab
Priority to CN202110690846.2A
Publication of CN113408625A
Application granted
Publication of CN113408625B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/10 Pre-processing; Data cleansing

Abstract

The invention discloses a single-frame fusion and consistent characterization method for multi-source heterogeneous data, applied to unmanned systems. The method fuses single-frame heterogeneous data produced by multiple sensors and builds a consistent representation of the environment, enabling comprehensive environmental perception and description. It effectively addresses both the fusion of multi-source heterogeneous data from diverse sensors and the consistent representation of the detection results.

Description

Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system
Technical Field
The invention relates to the technical field of sensor data fusion, in particular to a multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system.
Background
An unmanned system performs data fusion and builds a consistent representation of its environment from the heterogeneous data generated by its sensors, including but not limited to 2D image data, 3D point cloud data, inertial navigation data, astronomical data, temperature data and mechanical data. Exploiting this multi-source data to the fullest extent enables comprehensive perception and description of the environment. Traditional methods focus mainly on feature-level and decision-level fusion; methods that fuse large amounts of heterogeneous data at the data level, and in particular single-frame-oriented data-level fusion, are scarce, even though such fusion can provide the unmanned system with more comprehensive, higher-quality perception information.
Disclosure of Invention
The invention aims to provide a single-frame fusion and consistent characterization method for multi-source heterogeneous data in unmanned systems, which fuses the heterogeneous data of different sensors, completes a consistent characterization of the environment, and addresses the difficulty of fusing multi-source heterogeneous data at the data level.
To achieve this aim, the technical solution of the invention is as follows:
a multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system comprises the following steps:
Step one: data acquisition and preprocessing;
acquiring data simultaneously with M multi-source sensors, and preprocessing the resulting M data blocks so that each data block carries a description of its three-dimensional spatial position attribute, with all such attributes expressed in the same three-dimensional coordinate system;
Step two: fusion of same-type data;
classifying the preprocessed M data blocks by sensor type, grouping data blocks that share the same sensing principle or characterize the same physical property into the same type, and performing data-level fusion within each type to obtain N isomorphically fused data groups, where N is the number of heterogeneous data types;
Step three: dimension alignment of the fused data groups;
performing dimension statistics on the N fused data groups and constructing an X-dimensional data expression model, where X is the minimum number of dimensions that covers all distinct dimensions of the fused data groups; mapping the N fused data groups onto the data expression model and outputting N dimension-aligned data groups;
Step four: heterogeneous fusion and consistent characterization;
analyzing the data expression model to obtain the redundancy relations and dimension transformation relations among the X dimensions, selecting the data form to be output according to the actual task so as to form a consistent characterization data space of corresponding dimension, converting the N dimension-aligned data groups into that data space according to the dimension transformation relations, merging them into consistent characterization data after confidence discrimination processing, and outputting the consistent characterization data.
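For orientation only, the following is a minimal, hypothetical sketch of how the four steps could be chained in code; every function and field name is an illustrative assumption and not part of the patent, and the step bodies are trivial stand-ins. Fuller sketches of the individual steps accompany the detailed description below.

```python
# Minimal structural sketch of the four-step flow; all names are assumptions.

def acquire_and_preprocess(sensors):
    # Step one: one data block per sensor, tagged with its 3D position
    # expressed in the shared coordinate system.
    return [{"type": s["type"], "data": s["read"](), "position": s["pose"]}
            for s in sensors]

def fuse_same_type(blocks):
    # Step two: group blocks of the same heterogeneous type into N groups
    # (the actual within-group fusion is elided in this skeleton).
    groups = {}
    for b in blocks:
        groups.setdefault(b["type"], []).append(b["data"])
    return groups

def align_dimensions(groups):
    # Step three: map every group onto a common X-dimensional model
    # (a real implementation would pad missing dimensions, see below).
    return groups

def characterize(aligned):
    # Step four: resolve redundancy/conflicts and emit one consistent frame.
    return {"frame": aligned}

sensors = [{"type": "temperature", "read": lambda: 22.1, "pose": (0.8, 0.0, 0.3)}]
print(characterize(align_dimensions(fuse_same_type(acquire_and_preprocess(sensors)))))
```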
Further, the M multi-source sensors in step one are M sensors that provide data sources, and the types of multi-source heterogeneous data they acquire include 2D image data, 3D point cloud data, inertial navigation data, astronomical data, temperature data and mechanical data.
Further, the preprocessing in step one includes any one or more of smoothing and denoising, missing-value processing, data normalization and chromatic aberration correction; for a data source that does not contain three-dimensional position information, the three-dimensional spatial position of the acquired data is estimated from the sensor installation position and the sensor parameter model and is output as a fixed attribute of the data.
Further, the data-level fusion in step two is a merging process applied to the preprocessed data, and includes redundancy elimination and data normalization.
Further, the data expression model in step three is a data expression space containing the attribute dimensions of all the data to be fused, namely the union of the original attribute dimensions of all the data to be fused.
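As a concrete illustration of this union, a small sketch follows; the dimension names are assumptions made for the example, and the embodiment described later uses an 18-dimensional model built the same way.

```python
# Illustrative sketch: the X-dimensional expression model as the union of
# the attribute dimensions of the fused data groups.

FUSED_GROUP_DIMS = {
    "pointcloud": ["3dx", "3dy", "3dz"],
    "image":      ["2dx", "2dy", "R", "G", "B"],
    "thermal":    ["2dx", "2dy", "I"],      # pixel coords shared with "image"
}

model = []                                   # union, in first-seen order
for dims in FUSED_GROUP_DIMS.values():
    for d in dims:
        if d not in model:
            model.append(d)

X = len(model)
print(X, model)   # 9 ['3dx', '3dy', '3dz', '2dx', '2dy', 'R', 'G', 'B', 'I']
```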
Further, the dimension transformation relation in step four refers to the data conversion method between the original data set and the new data set when the dimensionality of the original data is changed; the conversion may change the form and description of the original data but should preserve the integrity of the data information as far as possible, and may be any one of grid generation, principal component analysis, factor analysis, linear combination and clustering.
Further, the confidence discrimination processing in step four is the processing strategy applied when, after the multi-source heterogeneous data have been converted into the same data description, conflicting values occur at the same point of the data space; the confidence parameters include the prior data-source priority, the raw-data error statistics obtained during preprocessing, the data density and the data consistency, and the merged data after conflict resolution are finally obtained by a weighted calculation over these confidence parameters.
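One possible reading of this weighted calculation is sketched below. The patent names the confidence parameters but not the exact weighting formula, so the combination used here (priority, density and consistency boost trust, error statistics reduce it) is an assumption, as are all numeric values.

```python
# Hedged sketch of confidence-weighted conflict resolution at one point
# of the data space; the weighting formula is an illustrative assumption.
import numpy as np

def merge_conflicting_values(values, priors, error_stats, densities, consistencies):
    """values[i] comes from source i; the other arrays are that source's
    confidence parameters. Returns one merged value."""
    values = np.asarray(values, dtype=float)
    # Higher priority / density / consistency and lower error -> more trust.
    confidence = (np.asarray(priors)
                  * np.asarray(densities)
                  * np.asarray(consistencies)
                  / (1.0 + np.asarray(error_stats)))
    weights = confidence / confidence.sum()
    return float(np.dot(weights, values))

# Example: three sources report a value for the same spatial point.
merged = merge_conflicting_values(
    values=[21.8, 22.4, 25.0],
    priors=[1.0, 0.8, 0.5],          # prior data-source priority
    error_stats=[0.1, 0.2, 1.0],     # error statistics from preprocessing
    densities=[0.9, 0.7, 0.3],       # local data density
    consistencies=[0.95, 0.9, 0.4],  # agreement with neighbouring data
)
print(round(merged, 2))
```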
The invention has the following beneficial effects:
according to the method, through data acquisition and preprocessing, data fusion of the same type, dimension alignment of a fused data group, and fusion and consistent characterization of heterogeneous data, the problem that fusion of multi-source heterogeneous data is difficult due to the problems of different meanings, different dimensions and the like is solved, and data-level fusion and consistent characterization among single-frame heterogeneous data are realized.
Drawings
To illustrate the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings used in their description are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart of a multi-source heterogeneous data single-frame fusion and consistent characterization method provided by the invention.
FIG. 2 is a block diagram illustrating a flow of a multi-source heterogeneous data single frame fusion and consistent characterization method according to an example embodiment.
Detailed Description
Embodiments of the present invention are described in detail below with reference to the accompanying drawings, in which examples of the embodiments are illustrated. The embodiments described with reference to the drawings are illustrative only and are not to be construed as limiting the invention. All other embodiments that a person skilled in the art can derive from the embodiments given herein without creative effort fall within the protection scope of the present application.
As shown in FIG. 1, the multi-source heterogeneous data single-frame fusion and consistent characterization method of the present invention, applied to an unmanned system, comprises the following specific steps:
the method comprises the following steps: and data acquisition and preprocessing, namely, carrying out simultaneous data acquisition by using M multi-source sensors, and preprocessing the obtained M data blocks to ensure that the M data blocks all have the description of the three-dimensional spatial position attribute, and the three-dimensional spatial position attributes share the same three-dimensional spatial coordinate system.
Further, in conjunction with the embodiment shown in FIG. 2, the following sub-steps are performed:
s101: m multisource sensors are fixedly installed on the unmanned system and share the same three-dimensional space coordinate system. Data acquisition of M multi-source sensors is completed in a mode of combining hardware triggering and software synchronization, so that the time consistency precision of all data reaches millisecond order;
specifically, the M multi-source sensors refer to M sensors providing data sources, the types of the sensors may be the same or different, and the types of the multi-source heterogeneous data acquired by the sensors include, but are not limited to, 2D image data, 3D point cloud data, inertial navigation data, astronomical data, temperature data and mechanical data. And synchronously generating a plurality of paths of PWM pulse signals by using the MCU of the lower computer to trigger corresponding sensors, and starting to acquire data by the sensors at the rising edge or the falling edge of the PWM signals. Meanwhile, time delay of sensor data in a signal transmission link is compensated in a software compensation mode, so that the upper computer obtains a frame of multi-source heterogeneous data with the same timestamp at the same moment.
S102: preprocessing of the multi-source heterogeneous data is completed;
specifically, the preprocessing includes, but is not limited to, performing processes such as smoothing denoising, missing value processing, data normalization, chromatic aberration correction on the acquired multi-source heterogeneous data; meanwhile, for a data source which does not contain three-dimensional space position information, the three-dimensional space position of the acquired data is estimated according to the sensor installation position and the sensor parameter model, and the three-dimensional space position information is output as the fixed attribute of the data.
Step two: fusion of same-type data;
the preprocessed M data blocks are classified by sensor type; data blocks that share the same sensing principle or characterize the same physical property are grouped into the same type, and data-level fusion is performed within each type to obtain N isomorphically fused data groups, where N is the number of heterogeneous data types.
Further, in conjunction with the embodiment shown in FIG. 2, the following sub-steps are performed:
the method comprises the following steps of carrying out feature extraction, feature description, feature matching, correlation calculation, affine transformation and the like on 2D images from different sensors to realize alignment of a plurality of 2D images and fusion of pixel levels, eliminating redundant parts of the images and finally outputting the images in a 2D image form;
3D point cloud data from different sensors are subjected to data level fusion among single-frame point cloud data of a plurality of sensors through the steps of point cloud feature extraction, point cloud registration, point cloud fusion and the like, redundant information is removed, and the 3D point cloud data are finally output in a 3D point cloud mode;
the inertial navigation data, the astronomical data, the temperature data, the mechanical data and other sensing data generated by other sensors generated by a plurality of different sensors are mutually verified, merged, fused and redundantly eliminated, and finally output in a data form.
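Structurally, this step amounts to grouping the preprocessed blocks by type and dispatching each group to a type-specific fusion routine, as in the simplified sketch below. The concatenate-and-deduplicate and median routines are deliberately naive stand-ins for the registration and mutual-verification procedures described above, and all names and values are assumptions.

```python
# Structural sketch of same-type fusion: group by type, dispatch to a
# per-type routine. The routines are simplified stand-ins.
import numpy as np

def fuse_point_clouds(clouds):
    """Merge already-registered single-frame clouds and drop duplicates."""
    merged = np.vstack(clouds)
    return np.unique(np.round(merged, 3), axis=0)   # coarse redundancy removal

def fuse_scalars(readings):
    """Mutually verify and merge scalar readings of the same quantity."""
    return float(np.median(readings))               # robust to one outlier

FUSERS = {"pointcloud": fuse_point_clouds, "scalar": fuse_scalars}

def fuse_within_types(blocks):
    groups = {}
    for b in blocks:                                # group by heterogeneous type
        groups.setdefault(b["type"], []).append(b["data"])
    return {t: FUSERS[t](data) for t, data in groups.items()}

blocks = [
    {"type": "pointcloud", "data": np.array([[0, 0, 0], [1, 1, 1]])},
    {"type": "pointcloud", "data": np.array([[1, 1, 1], [2, 2, 2]])},
    {"type": "scalar", "data": 22.0},
    {"type": "scalar", "data": 22.3},
]
fused = fuse_within_types(blocks)
print(fused["pointcloud"].shape, fused["scalar"])   # (3, 3) 22.15
```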
Step three: dimension alignment of the fused data groups.
An X-dimensional data expression model is constructed from the dimension statistics of the N fused data groups, where X is the minimum number of dimensions that covers all distinct dimensions of the fused data groups; the N fused data groups are mapped onto the data expression model, and N dimension-aligned data groups are output.
Further, in conjunction with the embodiment shown in FIG. 2, the following sub-steps are performed:
Step S301: in this embodiment, the 3D point cloud data are three-dimensional (spatial coordinates xyz), the 2D image data are five-dimensional (pixel coordinates xy + color information RGB), the thermal infrared data are three-dimensional (pixel coordinates xy + temperature I in degrees Celsius), the mechanical data are four-dimensional (sensor coordinates xyz + force N in newtons), the astronomical data are two angle values (in degrees), and the inertial navigation data are six-dimensional (angular velocity and acceleration along three axes). After the shared spatial-coordinate dimensions are merged, the union is an 18-dimensional data model comprising:
[3dx, 3dy, 3dz, 2dx, 2dy, R, G, B, I, N, d1, d2, gx, gy, gz, ax, ay, az].
Step S302: each frame of data from the multiple sensors is arranged as an 18-dimensional vector; vector positions for which a sensor provides no valid data are set to null, so that every frame of data becomes a vector of identical length.
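A minimal sketch of this mapping, with made-up values, is given below: each data element is written into the 18-dimensional model of step S301 and the dimensions its source does not provide remain null (NaN).

```python
# Sketch of mapping data elements onto the 18-dimensional expression model;
# missing dimensions stay NaN so every element has the same length.
import numpy as np

MODEL = ["3dx", "3dy", "3dz", "2dx", "2dy", "R", "G", "B", "I",
         "N", "d1", "d2", "gx", "gy", "gz", "ax", "ay", "az"]

def to_model_vector(record):
    """record: dict of dimension-name -> value for one data element."""
    vec = np.full(len(MODEL), np.nan)
    for name, value in record.items():
        vec[MODEL.index(name)] = value
    return vec

# One 2D image pixel (no 3D, thermal, mechanical or inertial dimensions):
pixel = to_model_vector({"2dx": 412, "2dy": 96, "R": 180, "G": 175, "B": 60})
# One 3D point from the fused cloud:
point = to_model_vector({"3dx": 1.24, "3dy": -0.31, "3dz": 0.88})

print(np.count_nonzero(~np.isnan(pixel)), np.count_nonzero(~np.isnan(point)))  # 5 3
```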
Step four: heterogeneous fusion and consistent characterization.
Analyzing the data expression model to obtain a redundant relation and a dimension transformation relation in X dimensions, selecting a data form which needs to be output finally according to an actual task to form a consistent characterization data space with corresponding dimensions, converting a data set with N aligned dimensions into the data space according to the dimension transformation relation, and merging the data set into consistent characterization data after confidence discrimination processing.
Further, in conjunction with the embodiment shown in FIG. 2, the following sub-steps are performed:
step S401: dimension reduction is performed on the data vectors in the step S302, for example, principal component analysis and the like are combined with external parameter information such as the installation positions of the sensors, and a redundant relationship and a dimension conversion relationship between the 18-dimensional data in the step S301 are obtained.
Step S402: according to the output requirements of the visualization task, the 2D image data may be selected for direct output; or the fused 3D point cloud data may be taken as the base, processed by grid reconstruction, grid correction and similar steps, and the output 2D image data mapped onto the 3D point cloud to complete the 3D output; or other data may be superimposed on the 3D data as separate drawing layers, yielding the multi-source heterogeneous fusion output and a consistent characterization result of the corresponding dimensionality.
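For the option of mapping the 2D image data onto the 3D point cloud, a hedged sketch is given below using an assumed pinhole camera model with identity extrinsics and made-up intrinsics; each 3D point picks up the color of the pixel it projects to. Grid reconstruction and layered overlay are omitted.

```python
# Sketch of coloring 3D points from a 2D image via an assumed pinhole
# projection (identity extrinsics). Intrinsics and data are made up.
import numpy as np

K = np.array([[500.0, 0.0, 320.0],     # fx, 0, cx
              [0.0, 500.0, 240.0],     # 0, fy, cy
              [0.0, 0.0, 1.0]])
H, W = 480, 640
image = np.zeros((H, W, 3), dtype=np.uint8)
image[:, :, 0] = 200                   # a dull red test image

points = np.array([[0.2, 0.1, 2.0],    # 3D points in the camera/shared frame
                   [-0.4, 0.0, 3.0]])

def colorize(points_xyz, image, K):
    uvw = (K @ points_xyz.T).T                       # project each point
    uv = (uvw[:, :2] / uvw[:, 2:3]).round().astype(int)
    colored = []
    for (u, v), p in zip(uv, points_xyz):
        if 0 <= u < image.shape[1] and 0 <= v < image.shape[0]:
            colored.append(np.concatenate([p, image[v, u]]))  # x y z r g b
    return np.array(colored)

print(colorize(points, image, K))
```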
While the present invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (7)

1. A multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system is characterized by comprising the following steps:
Step one: data acquisition and preprocessing;
acquiring data simultaneously with M multi-source sensors, and preprocessing the resulting M data blocks so that each data block carries a description of its three-dimensional spatial position attribute, with all such attributes expressed in the same three-dimensional coordinate system;
Step two: fusion of same-type data;
classifying the preprocessed M data blocks by sensor type, grouping data blocks that share the same sensing principle or characterize the same physical property into the same type, and performing data-level fusion within each type to obtain N isomorphically fused data groups, where N is the number of heterogeneous data types;
Step three: dimension alignment of the fused data groups;
performing dimension statistics on the N fused data groups and constructing an X-dimensional data expression model, where X is the minimum number of dimensions that covers all distinct dimensions of the fused data groups; mapping the N fused data groups onto the data expression model and outputting N dimension-aligned data groups;
Step four: heterogeneous fusion and consistent characterization;
analyzing the data expression model to obtain the redundancy relations and dimension transformation relations among the X dimensions, selecting the data form to be output according to the actual task so as to form a consistent characterization data space of corresponding dimension, converting the N dimension-aligned data groups into that data space according to the dimension transformation relations, merging them into consistent characterization data after confidence discrimination processing, and outputting the consistent characterization data.
2. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the M multi-source sensors in step one are M sensors providing data sources, and the types of multi-source heterogeneous data they acquire include 2D image data, 3D point cloud data, inertial navigation data, astronomical data, temperature data and mechanical data.
3. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the preprocessing in step one includes any one or more of smoothing and denoising, missing-value processing, data normalization and chromatic aberration correction, and for a data source that does not contain three-dimensional position information, the three-dimensional spatial position of the acquired data is further estimated from the sensor installation position and the sensor parameter model and is output as a fixed attribute of the data.
4. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the data-level fusion in step two is a merging process applied to the preprocessed data, comprising redundancy elimination and data normalization.
5. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the data expression model in step three is a data expression space containing the attribute dimensions of all the data to be fused, namely the union of the original attribute dimensions of all the data to be fused.
6. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the dimension transformation relation in step four refers to the data conversion method between the original data set and the new data set when the dimensionality of the original data is changed; the data conversion method may change the form and description of the original data but should preserve the integrity of the data information as far as possible, and includes any one of grid generation, principal component analysis, factor analysis, linear combination and clustering.
7. The multi-source heterogeneous data single-frame fusion and consistent characterization method applied to an unmanned system according to claim 1, wherein the confidence discrimination processing in step four is the processing strategy applied when, after the multi-source heterogeneous data have been converted into the same data description, a conflict occurs at the same point of the data space; the confidence parameters include the prior data-source priority, the raw-data error statistics from preprocessing, the data density and the data continuity, and the merged data after conflict resolution are finally obtained by a weighted calculation over these confidence parameters.
CN202110690846.2A 2021-06-22 2021-06-22 Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system Active CN113408625B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110690846.2A CN113408625B (en) 2021-06-22 2021-06-22 Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system

Publications (2)

Publication Number Publication Date
CN113408625A (en) 2021-09-17
CN113408625B (en) 2022-08-09

Family

ID=77682411

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110690846.2A Active CN113408625B (en) 2021-06-22 2021-06-22 Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system

Country Status (1)

Country Link
CN (1) CN113408625B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023064505A1 (en) * 2021-10-14 2023-04-20 Redzone Robotics, Inc. Data translation and interoperability
CN114827209A (en) * 2022-05-07 2022-07-29 南京四维智联科技有限公司 Data acquisition method and device, electronic equipment and storage medium
CN116701962B (en) * 2023-08-07 2023-10-27 北京电科智芯科技有限公司 Edge data processing method, device, computing equipment and storage medium
CN117171534B (en) * 2023-11-03 2024-03-19 济南二机床集团有限公司 Multi-source heterogeneous data acquisition method, system, device and medium for numerical control machine tool
CN117216722B (en) * 2023-11-09 2024-02-27 山东农业大学 Sensor time sequence data-based multi-source heterogeneous data fusion system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8290741B2 (en) * 2010-01-13 2012-10-16 Raytheon Company Fusing multi-sensor data sets according to relative geometrical relationships
CN104010370B (en) * 2014-04-28 2019-07-09 北京邮电大学 Heterogeneous system fused controlling method and device
US11874676B2 (en) * 2019-11-22 2024-01-16 JAR Scientific, LLC Cooperative unmanned autonomous aerial vehicles for power grid inspection and management

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945585A (en) * 2012-11-21 2013-02-27 苏州两江科技有限公司 Method for raising fire alarm through multi-sensor data fusion
CN105893612A (en) * 2016-04-26 2016-08-24 中国科学院信息工程研究所 Consistency expression method for multi-source heterogeneous big data
CN109166149A (en) * 2018-08-13 2019-01-08 武汉大学 A kind of positioning and three-dimensional wire-frame method for reconstructing and system of fusion binocular camera and IMU
CN110873879A (en) * 2018-08-30 2020-03-10 沈阳航空航天大学 Device and method for deep fusion of characteristics of multi-source heterogeneous sensor
CN111046245A (en) * 2019-12-11 2020-04-21 杭州趣链科技有限公司 Multi-source heterogeneous data source fusion calculation method, system, equipment and storage medium
CN111753024A (en) * 2020-06-24 2020-10-09 河北工程大学 Public safety field-oriented multi-source heterogeneous data entity alignment method
CN111950627A (en) * 2020-08-11 2020-11-17 重庆大学 Multi-source information fusion method and application thereof
CN112634451A (en) * 2021-01-11 2021-04-09 福州大学 Outdoor large-scene three-dimensional mapping method integrating multiple sensors

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Multi-source heterogeneous data fusion; L. Zhang et al.; 2018 International Conference on Artificial Intelligence and Big Data; 28 Jun. 2018; pp. 47-51 *
Data fusion method for livestock and poultry breeding IoT based on an improved support degree function; Duan Qingling et al.; Transactions of the Chinese Society of Agricultural Engineering; 28 Feb. 2017; Vol. 33; pp. 239-245 *

Also Published As

Publication number Publication date
CN113408625A (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN113408625B (en) Multi-source heterogeneous data single-frame fusion and consistent characterization method applied to unmanned system
CN111583663B (en) Monocular perception correction method and device based on sparse point cloud and storage medium
CN107430776B (en) Template manufacturing device and template manufacturing method
CN105931240A (en) Three-dimensional depth sensing device and method
JP6883608B2 (en) Depth data processing system that can optimize depth data by aligning images with respect to depth maps
JP7205613B2 (en) Image processing device, image processing method and program
JP4737763B2 (en) Free viewpoint image generation method, apparatus and program using multi-viewpoint images
CN109917419B (en) Depth filling dense system and method based on laser radar and image
CN105335955A (en) Object detection method and object detection apparatus
CN109147025B (en) RGBD three-dimensional reconstruction-oriented texture generation method
CN115116049B (en) Target detection method and device, electronic equipment and storage medium
CN102036094A (en) Stereo matching method based on digital score delay technology
CN115147545A (en) Scene three-dimensional intelligent reconstruction system and method based on BIM and deep learning
CN115546741A (en) Binocular vision and laser radar unmanned ship marine environment obstacle identification method
CN113643436B (en) Depth data splicing and fusion method and device
EP2913793B1 (en) Image processing device and image processing method
CN117274749B (en) Fused 3D target detection method based on 4D millimeter wave radar and image
CN108460724B (en) Adaptive image fusion method and system based on Mahalanobis distance discrimination
CN110197104B (en) Distance measurement method and device based on vehicle
CN112489189B (en) Neural network training method and system
US11494934B2 (en) Image processing device, image processing method, and monitoring system
US11941851B2 (en) Systems and methods for calibrating imaging and spatial orientation sensors
CN115761801A (en) Three-dimensional human body posture migration method based on video time sequence information
CN112487893B (en) Three-dimensional target identification method and system
CN109089100B (en) Method for synthesizing binocular stereo video

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant