CN112003891B - Multi-sensing data fusion method for intelligent networked vehicle controller - Google Patents


Info

Publication number
CN112003891B
CN112003891B (application CN202010683438.XA)
Authority
CN
China
Prior art keywords
data
sensor
sensors
weight
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010683438.XA
Other languages
Chinese (zh)
Other versions
CN112003891A (en)
Inventor
罗映
李丙洋
罗全巧
沈学会
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Promote Electromechanical Technology Co ltd
Original Assignee
Shandong Netlink Intelligent Vehicle Industry Technology Research Institute Co ltd
Shandong Promote Electromechanical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Netlink Intelligent Vehicle Industry Technology Research Institute Co ltd, Shandong Promote Electromechanical Technology Co ltd filed Critical Shandong Netlink Intelligent Vehicle Industry Technology Research Institute Co ltd
Priority to CN202010683438.XA priority Critical patent/CN112003891B/en
Publication of CN112003891A publication Critical patent/CN112003891A/en
Application granted granted Critical
Publication of CN112003891B publication Critical patent/CN112003891B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks, involving control of end-device applications over a network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00 Information sensed or collected by the things
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Indication And Recording Devices For Special Purposes And Tariff Metering Devices (AREA)
  • Small-Scale Networks (AREA)
  • Traffic Control Systems (AREA)

Abstract

The method presets a basic weight for each sensor and applies filter-based correction to the sensor data collected by all sensors. Data are screened by determining an effective data fusion set for the sensors, dynamic weights are then obtained through a similarity measure, and finally a comprehensive weight is computed from the basic weight and the dynamic weight; the data of the different sensors are weighted and fused according to the comprehensive weights to obtain the fused multi-sensor data. Filtering and correcting the collected data ensures the accuracy of the input data, and screening for effective data reduces the later computational load. By introducing dynamic weights, the fusion method can adjust the data fusion weights of different sensors in time under different perception scenes, yielding more accurate perception data.

Description

Multi-sensing data fusion method for intelligent networked vehicle controller
Technical Field
The invention relates to the field of unmanned vehicle control, in particular to a multi-sensing data fusion method for an intelligent networked vehicle controller.
Background
The intelligent networked vehicle acquires environmental information through various sensors; after receiving the sensor data, the vehicle controller fuses it to form environmental perception data that serves as the decision basis for the safe driving of the vehicle. Environment sensing requires multiple sensors working in cooperation, and data fusion across different sensors has long been a research hotspot for networked vehicles. In existing fusion schemes, the weight given to each sensor's data is mostly determined by the sensor's grade, and that grade is preset according to the sensor's category. Because the grade is fixed, the weight of a sensor's data cannot change across scenarios; in some scenes this leads to inaccurate data fusion and biased environment perception, ultimately affecting the correctness of the controller's decisions.
Disclosure of Invention
In order to solve the technical problem, the invention provides a multi-sensing data fusion method for an intelligent networked vehicle controller, which can adjust the weight of data according to the environment and determine an optimal fusion data set.
The invention is realized by the following method: the multi-sensing data fusion method for the intelligent networked vehicle controller comprises the following steps:
s1: presetting the basic weight of the sensor;
Each sensor is compared with the other sensors using the AHP (analytic hierarchy process) method to judge its grade. The comparison factors mainly include sensor precision, detection range, and the like; values are assigned according to the importance of the data acquired by one sensor relative to the data of the other sensors.
According to this grade-weighted comparison, the kth sensor is compared one by one with sensors k+1, k+2, and so on, yielding a weight matrix U1 composed of the assigned weights.
By computing the maximum eigenvalue of the weight matrix U1, the corresponding eigenvector is obtained; this eigenvector is the basic weight e1 of the sensor.
S2: filtering and correcting the sensor data acquired by all the sensors;
the specific steps of filtering and correcting comprise:
s201: collecting a data set;
Assume that n vehicle sensors can monitor the same target and that each sensor takes m measurements, where 1 ≤ k ≤ n and 1 ≤ i ≤ m. The ith measurement of the kth sensor is recorded as f(k,i), and the data set a_k represents all the data acquired by the kth sensor: a_k = {f(k,1), f(k,2), f(k,3), …, f(k,i), …, f(k,m)}.
S202: compute the probability P_ki with which each data point f(k,i) occurs in the data set a_k;
According to the probability density function, the raw data f(k,i) and the mean f̄(k,i) of the ith measurement and the s preceding measurements of the kth sensor satisfy the following (Gaussian) relationship:

P_ki = 1/(γ·√(2π)) · exp(−(f(k,i) − f̄(k,i))² / (2γ²))
In the above formula, γ denotes the standard deviation of the measurement noise; it differs from sensor to sensor and is determined by comparing the known parameters of a reference target with the parameters obtained by the sensor and statistically analysing the data. The initial value may be estimated empirically and revised step by step as more comparison data are collected; it may be based on the sample standard deviation:

γ ≈ √( (1/s) · Σ_{j=i−s..i} (f(k,j) − f̄(k,i))² )
The probabilities of occurrence in the data set a_k of the ith measurement of the kth sensor and its s predecessors are calculated in the same way, giving P_kj for j = i−s, …, i.
S203: based on the raw measurement data f(k,i) and its probability of occurrence P_ki in a_k, correct the data to obtain the corrected data f0(k,i).
When a plurality of data have been collected, the corrected value is the probability-weighted average

f0(k,i) = ( Σ_{j=i−s..i} P_kj · f(k,j) ) / ( Σ_{j=i−s..i} P_kj )
s3: determining a valid data fusion set of sensors;
An effective data set is selected by judging the validity of each sensor's corrected data.
Specifically, correcting the measurement data of the same target obtained by sensor k gives a_0k, which contains a plurality of corrected measurements: a_0k = {f0(k,1), f0(k,2), f0(k,3), …, f0(k,i)}.
For each corrected measurement in a_0k, calculate the absolute value Δl_ki of its difference from each of the other measurements in a_0k, and then calculate the mean E_k of the Δl_ki; E_k is called the error mean of the corrected data of sensor k.
Calculating the error mean values of the correction data of different sensors, further averaging to obtain a data fusion base number lambda after obtaining the error mean values of the correction data of a plurality of sensors,
the data fusion cardinality λ is a standard constant that determines efficient data fusion. The radix λ calculation method is as follows:
Figure GDA0003576138820000041
according to the above, correcting the data error mean value E k Δl ki <Or when the data of the sensor is determined to be valid data which can be merged.
S4: obtaining a dynamic weight value through similarity measurement;
s401: calculating the Jacard similarity coefficient of the sensor;
In this step, the Jaccard similarity coefficient is calculated from the Jaccard distance, and a variable weight value is obtained.
The Jaccard similarity coefficient R1 of the kth sensor with respect to the (k+1)th sensor is

R1 = G1 / (G1 + G2 + G3)

In the above formula, G1, G2, and G3 denote element counts, confirmed as follows:
the elements of matrix U2 are compared with the corresponding elements of comparison matrix U3; G1 is the number of positions whose corresponding elements are both 1, G2 is the number of positions where the element of U2 is 1 and the corresponding element of U3 is 0, and G3 is the number of positions where the element of U2 is 0 and the corresponding element of U3 is 1.
Similarly, the Jaccard similarity coefficient R2 of the kth sensor with respect to the (k+2)th sensor is obtained by the same method.
S402: obtaining dynamic weight of the sensor through the Jacard similarity coefficient;
Obtain the n−1 relative Jaccard similarity coefficients of the kth sensor and average them to obtain the dynamic weight e2 of the kth sensor. Since the effective data fusion set was determined in the previous step and the data sets exceeding the standard constant for effective fusion have been screened out, the data in this step show little dispersion and a small standard deviation, so the average Jaccard similarity coefficient can serve as the dynamic weight e2.
S5: calculating a comprehensive weight e for fusing sensor data, and performing weighted fusion on the multi-sensor data according to the comprehensive weight to obtain final fusion data;
The comprehensive weight e is calculated as

e = c·e1 + (1 − c)·e2

where c is a basic-weight adjustment coefficient determined by the particularity and importance of the sensor (typically 0.5; the more important the sensor's data type, the larger the value), e1 is the basic weight, and e2 is the variable (dynamic) weight.
Advantageous effects: filtering and correcting the collected data ensures the accuracy of the input data, and screening for valid data reduces the later computational load. In the data fusion method of the invention, the data weight comprises a basic weight and a dynamic weight, so the fusion weights of the different sensors are adjusted in time under different sensing scenes to obtain more accurate perception data, overcoming the fixed-fusion-weight defect of the prior art.
Drawings
FIG. 1 is a schematic flow chart of the present invention.
Detailed Description
The method of the invention will now be described more fully with reference to the examples; it is understood that all variations that a person skilled in the art can obtain, without inventive step, on the basis of the method of the invention fall within the scope of protection of the invention.
The multi-sensing data fusion method for the intelligent networked vehicle controller comprises the following steps:
s1: presetting the basic weight of the sensor;
specifically, the method comprises the following steps:
S101: each sensor is compared with the other sensors using the AHP (analytic hierarchy process) method to judge its grade; the comparison factors mainly include sensor precision, detection range, and the like, and values are assigned according to the importance of the data acquired by one sensor relative to the data of the other sensors.
Assuming that the kth sensor is compared with the (k + 1) th sensor, the importance levels are classified and weighted as follows:
(The assignment table appears as an image in the original; it maps relative-importance grades to the assignment values b1 to b9.)
In the assignment table, 1 < b1 < b2 < b3 < b4 < b5 and 0 < b9 < b8 < b7 < b6 < 1.
According to this grade-weighted comparison, the kth sensor is compared one by one with sensors k+1, k+2, and so on, yielding a weight matrix U1 composed of the assigned weights.
S102: calculating the basic weight;
By computing the maximum eigenvalue of the weight matrix U1, the corresponding eigenvector is obtained; this eigenvector is the basic weight e1 of the sensor.
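As a sketch of S101–S102, the basic weight e1 can be obtained as the normalized principal eigenvector of U1. The pure-Python power iteration and the 3-sensor comparison matrix below are illustrative assumptions, not taken from the patent:

```python
def ahp_base_weights(U, iters=100):
    """Approximate the principal eigenvector of the pairwise comparison
    matrix U by power iteration, normalized to sum to 1 (AHP basic weight e1)."""
    n = len(U)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(U[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]          # renormalize on every iteration
    return w

# Hypothetical 3-sensor comparison matrix in reciprocal AHP style:
U1 = [[1, 3, 5],
      [1/3, 1, 2],
      [1/5, 1/2, 1]]
e1 = ahp_base_weights(U1)               # the first sensor gets the largest weight
```

Power iteration converges here because an AHP comparison matrix is entrywise positive, so the Perron eigenvector dominates.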
S2: filtering and correcting the sensor data acquired by all the sensors;
the specific steps of filtering and correcting comprise:
s201: collecting a data set;
Assume that n vehicle sensors can monitor the same target and that each sensor takes m measurements, where 1 ≤ k ≤ n and 1 ≤ i ≤ m. The ith measurement of the kth sensor is recorded as f(k,i), and the data set a_k represents all the data acquired by the kth sensor: a_k = {f(k,1), f(k,2), f(k,3), …, f(k,i), …, f(k,m)}. In addition, identical measurement values may occur several times in a_k, e.g. f(k,1) = f(k,2).
S202: compute the probability P_ki with which each data point f(k,i) occurs in the data set a_k;
According to the probability density function, the raw data f(k,i) and the mean f̄(k,i) of the ith measurement and the s preceding measurements of the kth sensor satisfy the following (Gaussian) relationship:

P_ki = 1/(γ·√(2π)) · exp(−(f(k,i) − f̄(k,i))² / (2γ²))
In the above formula, γ denotes the standard deviation of the measurement noise; it differs from sensor to sensor and is determined by comparing the known parameters of a reference target with the parameters obtained by the sensor and statistically analysing the data. The initial value may be estimated empirically and revised step by step as more comparison data are collected; it may be based on the sample standard deviation:

γ ≈ √( (1/s) · Σ_{j=i−s..i} (f(k,j) − f̄(k,i))² )
In the above formula, f̄(k,i) is the mean of the ith measurement and the s preceding measurements of the kth sensor, calculated according to a maximum-likelihood estimation algorithm:

f̄(k,i) = (1/(s+1)) · Σ_{j=i−s..i} f(k,j)
The probabilities of occurrence in the data set a_k of the ith measurement of the kth sensor and its s predecessors are calculated in the same way, giving P_kj for j = i−s, …, i.
S203: based on the raw measurement data f(k,i) and its probability of occurrence P_ki in a_k, correct the data to obtain the corrected data f0(k,i).
When a plurality of data have been collected, the corrected value is the probability-weighted average

f0(k,i) = ( Σ_{j=i−s..i} P_kj · f(k,j) ) / ( Σ_{j=i−s..i} P_kj )
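A runnable sketch of S202–S203 follows. The Gaussian form of P_ki and the probability-weighted average are assumptions consistent with the surrounding text (the patent's own formulas appear only as images), and the readings and parameters are illustrative:

```python
import math

def correct_measurements(data, s, gamma):
    """For each index i, weight the i-th value and its s predecessors by a
    Gaussian density centred on their window mean (noise std gamma), and
    return the density-weighted averages as the corrected values f0(k,i).
    The constant 1/(gamma*sqrt(2*pi)) cancels in the average, so it is omitted."""
    corrected = []
    for i in range(len(data)):
        window = data[max(0, i - s): i + 1]
        mean = sum(window) / len(window)
        dens = [math.exp(-(x - mean) ** 2 / (2 * gamma ** 2)) for x in window]
        corrected.append(sum(d * x for d, x in zip(dens, window)) / sum(dens))
    return corrected

readings = [10.0, 10.1, 9.9, 30.0, 10.05]   # hypothetical values; 30.0 plays the outlier
f0 = correct_measurements(readings, s=2, gamma=0.5)
```

The outlier's corrected value is pulled toward the plausible readings in its window, which is the filtering effect the patent attributes to this step.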
s3: determining a valid data fusion set of sensors;
An effective data set is selected by judging the validity of each sensor's corrected data.
Specifically, correcting the measurement data of the same target obtained by sensor k gives a_0k, which contains a plurality of corrected measurements: a_0k = {f0(k,1), f0(k,2), f0(k,3), …, f0(k,i)}.
For each corrected measurement in a_0k, calculate the absolute value Δl_ki of its difference from each of the other measurements in a_0k, and then calculate the mean E_k of the Δl_ki; E_k is called the error mean of the corrected data of sensor k.
Calculating the error mean values of the correction data of different sensors, further averaging to obtain a data fusion base number lambda after obtaining the error mean values of the correction data of a plurality of sensors,
the data fusion cardinality λ is a standard constant that determines efficient data fusion. The radix λ calculation method is as follows:
Figure GDA0003576138820000091
error mean value E of corrected data when k sensor measured data k Δl ki Above λ, the sensor data exceeds the standard constant for valid data fusion and therefore does not belong to fusible data.
According to the above, the data error mean value E is corrected k Δl ki <Or λ, the data of the sensor is determined as valid data which can be fused.
S4: obtaining a dynamic weight value through similarity measurement;
s401: calculating the Jacard similarity coefficient of the sensor;
In this step, the Jaccard similarity coefficient is calculated from the Jaccard distance, and a variable weight value is obtained.
The Jaccard distance measures how much two sets differ by the ratio of differing elements to all elements. Specifically, the data acquired by each sensor at each moment (only data within the standard constant for effective data fusion) are compared with the data acquired by the other sensors at that moment, and each comparison result is recorded as 1 (same) or 0 (different). For example, suppose the kth sensor is compared with the (k+1)th and (k+2)th sensors and each sensor performs 3 measurements, corrected according to the preceding steps: the corrected measurement set of the kth sensor is a_i = {a1, a2, a3}, that of the (k+1)th sensor is b_i = {b1, b2, b3}, and that of the (k+2)th sensor is d_i = {d1, d2, d3}. The first corrected datum a1 of the kth sensor is compared with b1 and d1, the second datum a2 with b2 and d2, and a3 with b3 and d3; outputting each comparison result as 1 or 0 yields the comparison matrix U2 of the kth sensor, which contains the comparison of the kth sensor against all measurements of all the other sensors. Similarly, the comparison matrix U3 of the (k+1)th sensor and the comparison matrix U4 of the (k+2)th sensor are obtained.
The Jaccard similarity coefficient R1 of the kth sensor with respect to the (k+1)th sensor is obtained as

R1 = G1 / (G1 + G2 + G3)

In the above formula, G1, G2, and G3 denote element counts, confirmed as follows:
the elements of matrix U2 are compared with the corresponding elements of comparison matrix U3; G1 is the number of positions whose corresponding elements are both 1, G2 is the number of positions where the element of U2 is 1 and the corresponding element of U3 is 0, and G3 is the number of positions where the element of U2 is 0 and the corresponding element of U3 is 1.
Similarly, the Jaccard similarity coefficient R2 of the kth sensor with respect to the (k+2)th sensor is obtained by the same method.
Further, in the comparisons of this step, the data excluded in step S3, i.e. the data exceeding the standard constant for effective data fusion, are all represented by 0.
S402: obtaining dynamic weight of the sensor through the Jacard similarity coefficient;
Obtain the n−1 relative Jaccard similarity coefficients of the kth sensor and average them to obtain the dynamic weight e2 of the kth sensor. Since the effective data fusion set was determined in the previous step and the data sets exceeding the standard constant for effective fusion have been screened out, the data in this step show little dispersion and a small standard deviation, so the average Jaccard similarity coefficient can serve as the dynamic weight e2.
Further, if necessary, the Jaccard similarity coefficients of the sensors may first be normalized and the dynamic weight then obtained by averaging.
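S401–S402 can be sketched with binary agreement vectors standing in for the comparison matrices U2, U3, U4; this flattening, and the vectors themselves, are illustrative simplifications:

```python
def jaccard_coeff(u, v):
    """R = G1 / (G1 + G2 + G3): G1 counts positions that are 1 in both
    vectors, G2 positions that are 1 only in u, G3 positions 1 only in v."""
    g1 = sum(1 for a, b in zip(u, v) if a == 1 and b == 1)
    g2 = sum(1 for a, b in zip(u, v) if a == 1 and b == 0)
    g3 = sum(1 for a, b in zip(u, v) if a == 0 and b == 1)
    return g1 / (g1 + g2 + g3) if (g1 + g2 + g3) else 0.0

def dynamic_weight(rows):
    """Average sensor k's Jaccard coefficients against the other n-1
    sensors to get its dynamic weight e2 (step S402)."""
    own, others = rows[0], rows[1:]
    return sum(jaccard_coeff(own, r) for r in others) / len(others)

rows = [[1, 1, 0],   # sensor k's agreement vector (U2, flattened)
        [1, 1, 0],   # sensor k+1 (U3)
        [1, 0, 0]]   # sensor k+2 (U4)
e2 = dynamic_weight(rows)
```

A sensor whose measurements agree with the others more often thus receives a larger dynamic weight.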
S5: calculating a sensor fusion comprehensive weight e, and performing weighted fusion calculation on different sensor data according to the comprehensive weight to obtain fusion data of a plurality of sensors;
The comprehensive weight e is calculated as

e = c·e1 + (1 − c)·e2

where c is a basic-weight adjustment coefficient determined by the particularity and importance of the sensor (typically 0.5; the more important the sensor's data type, the larger the value), e1 is the basic weight, and e2 is the variable (dynamic) weight. The data of the different sensors are weighted and fused according to each sensor's comprehensive weight, finally yielding the fused data.
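Step S5 then reduces to a normalized weighted average; in this sketch the values of c, the weights, and the readings are all illustrative:

```python
def fuse(values, base_w, dyn_w, c=0.5):
    """Comprehensive weight e = c*e1 + (1-c)*e2 per sensor, then a
    normalized weighted average of one reading per sensor (step S5)."""
    e = [c * b + (1 - c) * d for b, d in zip(base_w, dyn_w)]
    return sum(w * v for w, v in zip(e, values)) / sum(e)

fused = fuse([10.0, 12.0], base_w=[0.6, 0.4], dyn_w=[0.5, 0.5], c=0.5)
```

Here the first sensor's larger basic weight pulls the fused value toward its reading.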
Filtering and correcting the collected data ensures the accuracy of the input data, and screening for valid data further reduces the later computational load. In the data fusion method of the invention, the data weight comprises a basic weight and a dynamic weight, and the fusion weights of the different sensors are adjusted in time under different perception scenes to obtain more accurate perception data.

Claims (1)

1. The multi-sensing data fusion method for the intelligent networked vehicle controller is characterized by comprising the following steps of:
s1: presetting the basic weight of the sensor; the presetting of the basic weight of the sensor in S1 includes:
S101: comparing each sensor with the other sensors using the AHP method, judging the grade of the sensor, and assigning values to the data obtained by a given sensor according to their importance grade relative to the data of the other sensors, so as to obtain a weight matrix U1;
S102: computing the maximum eigenvalue of the weight matrix U1 to obtain the corresponding eigenvector, which is the basic weight e1 of the sensor;
S2: filtering and correcting the sensor data acquired by all the sensors;
s3: determining a valid data fusion set of sensors;
s4: obtaining a dynamic weight value through similarity measurement; the method specifically comprises the following steps:
s401: calculating the Jacard similarity coefficient of the sensor;
The step S401 comprises: comparing the data acquired by each sensor with the data acquired by the other sensors at the same moment to obtain a plurality of comparison matrices, and comparing the corresponding elements of the comparison matrices to obtain the Jaccard similarity coefficients;
s402: obtaining dynamic weight of the sensor through the Jacard similarity coefficient;
The S402 comprises: obtaining the n−1 relative Jaccard similarity coefficients of the kth sensor and averaging them to obtain the dynamic weight e2 of the kth sensor, where n is the total number of sensors;
s5: calculating a comprehensive weight e, and performing weighted fusion calculation on different sensor data according to the comprehensive weight to obtain fusion data of a plurality of sensors; the calculation method of the comprehensive weight e in the step S5 is as follows:
e = c·e1 + (1 − c)·e2
where c is a basic-weight adjustment coefficient determined according to the particularity and importance of the sensor, e1 is the basic weight, and e2 is the dynamic weight.
CN202010683438.XA 2020-07-16 2020-07-16 Multi-sensing data fusion method for intelligent networked vehicle controller Active CN112003891B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010683438.XA CN112003891B (en) 2020-07-16 2020-07-16 Multi-sensing data fusion method for intelligent networked vehicle controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010683438.XA CN112003891B (en) 2020-07-16 2020-07-16 Multi-sensing data fusion method for intelligent networked vehicle controller

Publications (2)

Publication Number Publication Date
CN112003891A CN112003891A (en) 2020-11-27
CN112003891B true CN112003891B (en) 2022-09-06

Family

ID=73467344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010683438.XA Active CN112003891B (en) 2020-07-16 2020-07-16 Multi-sensing data fusion method for intelligent networked vehicle controller

Country Status (1)

Country Link
CN (1) CN112003891B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022115993A1 (en) * 2020-12-01 2022-06-09 Robert Bosch Gmbh Method and apparatus for tuning sensor fusion weights
CN113447671B (en) * 2021-07-15 2022-09-23 中煤科工集团重庆研究院有限公司 Roadway section wind speed detection method based on high-frequency and low-frequency ultrasonic waves

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3013834B1 (en) * 2013-11-28 2015-12-25 Airbus Operations Sas METHOD FOR MERGING SENSOR DATA USING A COHERENCE CRITERION
CN103926564B (en) * 2014-03-21 2016-08-24 成都民航空管科技发展有限公司 A kind of multi-source monitors fusion method and the device of data
CN105136264B (en) * 2015-09-29 2017-09-22 北京万集科技股份有限公司 It is a kind of to combine the weight acquisition methods and device, weighing system weighed based on multi-site
CN106408983B (en) * 2016-10-10 2019-01-04 上海宏英智能科技有限公司 A kind of Vehicular automatic driving system
CN110020394B (en) * 2017-08-01 2023-07-18 广州极飞科技股份有限公司 Data processing method and device
CN108573270B (en) * 2017-12-15 2020-04-28 上海蔚来汽车有限公司 Method and apparatus for synchronizing multi-sensor target information fusion and multi-sensor sensing, computer device, and recording medium
CN108960334B (en) * 2018-07-12 2021-09-14 中国人民解放军陆军炮兵防空兵学院郑州校区 Multi-sensor data weighting fusion method
CN109784664A (en) * 2018-12-20 2019-05-21 广东广业开元科技有限公司 A kind of external dependence degree calculation method, system and device based on changeable weight
CN109766958B (en) * 2019-04-12 2019-07-05 江苏量动信息科技有限公司 A kind of data preprocessing method and device for data fusion
CN110135952B (en) * 2019-05-16 2022-07-19 深圳市梦网视讯有限公司 Commodity recommendation method and system based on class similarity
CN110389971A (en) * 2019-06-28 2019-10-29 长春工程学院 A kind of multi-Sensor Information Fusion Approach based on cloud computing
CN110324336B (en) * 2019-07-02 2021-07-30 成都信息工程大学 Internet of vehicles data situation sensing method and device based on network security

Also Published As

Publication number Publication date
CN112003891A (en) 2020-11-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220705

Address after: 1406-1, floor 14, building a7-5, Hanyu Golden Valley, No. 7000, Jingshi Road, Jinan area, China (Shandong) pilot Free Trade Zone, Jinan, Shandong Province, 250101

Applicant after: SHANDONG PROMOTE ELECTROMECHANICAL TECHNOLOGY CO.,LTD.

Applicant after: Shandong netlink Intelligent Vehicle Industry Technology Research Institute Co.,Ltd.

Address before: 250100 No. 1406, 14th floor, Xinlian science and technology building, building a7-5, Hanyu Golden Valley, No. 7000, Jingshi Road, Jinan area, China (Shandong) pilot Free Trade Zone, Jinan, Shandong Province

Applicant before: Shandong netlink Intelligent Vehicle Industry Technology Research Institute Co.,Ltd.

GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A multi-sensor data fusion method for intelligent networked vehicle controllers

Effective date of registration: 20231019

Granted publication date: 20220906

Pledgee: Jinan Branch of Qingdao Bank Co.,Ltd.

Pledgor: SHANDONG PROMOTE ELECTROMECHANICAL TECHNOLOGY CO.,LTD.

Registration number: Y2023370000116