CN117034201A - Multi-source real-time data fusion method - Google Patents
- Publication number
- CN117034201A (application CN202311291695.9A)
- Authority
- CN
- China
- Prior art keywords
- data
- time
- sensor
- acquired
- sensors
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2123/00—Data types
- G06F2123/02—Data types in the time domain, e.g. time-series data
Abstract
The invention discloses a multi-source real-time data fusion method, relating to the technical fields of information and petroleum exploration. By fusing the data transmitted by all sensors in real time, the method achieves time synchronization of multi-source data. It thereby solves the problem that, because the sensors' acquisition frequencies differ or their sampling is non-uniform, the acquisition times of the various sensor data are inconsistent, preventing staff from correctly evaluating the current working condition.
Description
Technical Field
The invention belongs to the technical fields of information and petroleum exploration, and in particular relates to a multi-source real-time data fusion method.
Background
During drilling in petroleum exploitation, sensors are the key devices for monitoring the parameters of the drilling operation. They monitor important parameters on the drilling platform, such as temperature, humidity and pressure, in real time; ensure the safety of the drilling operation; and help workers monitor working conditions in real time and prevent accidents. Sensors play an important role in modern drilling operations and provide a firm guarantee for efficient and safe drilling.
The drilling-platform center receives the real-time drilling data acquired by the sensors, and staff compile and analyze these data to follow the progress of the operation in time and to take effective preventive measures against problems and risks that may arise during drilling, thereby ensuring safe production. However, because the sensors' acquisition frequencies are inconsistent, and some sensors even sample non-uniformly, the real-time data received by the drilling-platform center do not correspond to the same moment. A small time error can translate into a large spatial error during drilling, and the mismatch in acquisition times prevents workers from accurately evaluating the current working condition.
Disclosure of Invention
In view of the above problems, the present invention provides a multi-source real-time data fusion method, as shown in FIG. 1, which fuses the collected data of the various sensors in real time and time-synchronizes the data from the different sources, improving the accuracy of the drilling data so that staff can evaluate the current working condition promptly and correctly.
To achieve the above object, the principle of the present invention is as follows. Each sensor appends its acquisition time when it outputs an acquisition signal, and all sensors are time-calibrated before a work task starts so that their clocks remain consistent. During the work task, one sensor is taken as the reference and denoted sensor A. The acquisition times of the latest two frames of data received from sensor A in real time are denoted $t_1$ and $t_2$, with the condition $t_1 < t_2$. The data of every other sensor at time $t_2$ are then calculated, and finally all sensor data at time $t_2$ are combined to obtain fusion data meeting the time-synchronization requirement.
Compared with the prior art, the beneficial effects of the invention are as follows. The invention solves the problem that the accuracy of drilling data is degraded because the signals acquired by the individual sensors are not time-synchronized during drilling. It calculates the transmitted data of all sensors at the same moment, meeting the requirements of time synchronization and real-time operation, thereby improving the accuracy of the data and enabling timely and correct evaluation of the current working condition.
Drawings
FIG. 1 is a schematic diagram of different data sources for performing data fusion;
FIG. 2 is a schematic diagram of the calculation method when no data are collected in the interval $(t_1, t_2]$;
FIG. 3 is a comparison of the calculated data and the acquired data of sensor B;
FIG. 4 is a comparison of the calculated data and the acquired data of sensor C.
Detailed Description
In order that the manner in which the above recited objects, features and advantages of the present invention are obtained will become more readily apparent, a more particular description of the invention briefly described above will be rendered by reference to the appended drawings.
Before a work task starts, all sensors are configured according to their different functions and the different parameter indexes they acquire: each sensor is set to append its acquisition time when it outputs an acquisition signal, and all sensors are time-calibrated so that their clocks remain consistent.
The acquired data of the other sensors are calculated with reference to sensor A. Each frame of data output by sensor A is recorded as $A_t = [t, a_t^1, a_t^2, \ldots, a_t^m]$, where $t$ is the time at which sensor A acquired the data and $a_t^j$ is the value of the $j$-th parameter at time $t$. Taking sensor B as the example for calculating the transmitted data of the other sensors, each frame of data output by sensor B is recorded as $B_t = [t, b_t^1, b_t^2, \ldots, b_t^m]$, where $t$ is the time at which sensor B acquired the data and $b_t^j$ is the value of the $j$-th parameter at time $t$.
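The per-frame layout above can be sketched as a small data structure. This is a non-authoritative Python sketch; the field names `t` and `values` are illustrative choices, not from the patent:

```python
from typing import List, NamedTuple

class Frame(NamedTuple):
    """One frame output by a sensor: the stamped acquisition time
    plus the values of its m parameters at that time."""
    t: float             # acquisition time appended by the sensor
    values: List[float]  # values[j-1] is the j-th parameter value at time t

# Example: a frame from sensor A at t = 10 s carrying two parameters.
frame_a = Frame(t=10.0, values=[3.1, 47.2])
```

Because `Frame` is a tuple subclass, it unpacks as `(t, values)`, which the later sketches rely on.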
The latest two frames of data acquired from sensor A in real time are recorded as $A_{t_1}$ and $A_{t_2}$, where $t_1 < t_2$. The following cases are distinguished to calculate the data of sensor B at time $t_2$, $\hat{B}_{t_2} = [t_2, \hat{b}_{t_2}^1, \ldots, \hat{b}_{t_2}^m]$, where $\hat{b}_{t_2}^j$ is the calculated value of the $j$-th parameter at time $t_2$.
(1) When sensor B collects data at time $t_2$, i.e., the collected data are $B_{t_2}$, then $\hat{B}_{t_2} = B_{t_2}$, i.e., $\hat{b}_{t_2}^j = b_{t_2}^j$, where $j = 1, 2, \ldots, m$;
(2) When sensor B does not collect data at time $t_2$ but collects one or more frames of data in the interval $(t_1, t_2)$: take the frame collected in $(t_1, t_2)$ closest to $t_2$, recorded as $B_{t_3}$, and the frame collected in $(0, t_1]$ closest to $t_1$, recorded as $B_{t_4}$. The data $\hat{B}_{t_2}$ at time $t_2$ are then calculated according to the interpolation method:

$$\hat{b}_{t_2}^j = b_{t_4}^j + \frac{t_2 - t_4}{t_3 - t_4}\left(b_{t_3}^j - b_{t_4}^j\right),$$

where $j = 1, 2, \ldots, m$;
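Case (2) is plain linear interpolation in time between the two frames bracketing $t_1$ and nearest $t_2$. A minimal Python sketch, assuming frames are represented as `(time, values)` tuples (a representation chosen here for illustration):

```python
def interpolate_at(frame_t4, frame_t3, t2):
    """Case (2): estimate sensor B's parameters at t2 from the frame B_t4
    nearest t1 in (0, t1] and the frame B_t3 nearest t2 in (t1, t2),
    interpolating each parameter linearly in time."""
    t4, b4 = frame_t4
    t3, b3 = frame_t3
    w = (t2 - t4) / (t3 - t4)  # fractional position of t2 relative to t4 and t3
    return (t2, [v4 + w * (v3 - v4) for v3, v4 in zip(b3, b4)])

# Frames at t4 = 9 s and t3 = 11 s; estimate the single parameter at t2 = 12 s.
t2_frame = interpolate_at((9.0, [1.0]), (11.0, [2.0]), 12.0)  # → (12.0, [2.5])
```

Note that when $t_3 < t_2$ the same formula extrapolates slightly beyond the newest frame of sensor B, which is what makes the result available at sensor A's time stamp.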
(3) When sensor B does not collect data at time $t_2$ and no data are collected in the interval $(t_1, t_2]$: take the three frames collected in $(0, t_1]$ closest to $t_1$, recorded as $B_{t_3}$, $B_{t_4}$ and $B_{t_5}$, where $t_5 < t_4 < t_3 \le t_1$. The data $\hat{B}_{t_2}$ at time $t_2$ are then calculated in the following steps, as shown in FIG. 2:
step 1, according to the following stepsTime of day acquisition dataRespectively calculating by interpolation methodTime of day dataAnd atTime of day dataThe calculation method is as follows:
wherein the method comprises the steps of;
Step 2: calculate the deviation $\varepsilon_{t_3}^j$ between the data $b_{t_3}^j$ collected at time $t_3$ and the calculated data $\tilde{b}_{t_3}^j$:

$$\varepsilon_{t_3}^j = b_{t_3}^j - \tilde{b}_{t_3}^j,$$

where $j = 1, 2, \ldots, m$;
Step 3: calculate the deviation $\varepsilon_{t_2}^j$ at time $t_2$ by scaling $\varepsilon_{t_3}^j$ in proportion to the time offsets (the interpolation of Step 1 is exact at $t_4$, so the deviation grows linearly from $t_4$):

$$\varepsilon_{t_2}^j = \frac{t_2 - t_4}{t_3 - t_4}\,\varepsilon_{t_3}^j,$$

where $j = 1, 2, \ldots, m$;
Step 4: calculate the data of sensor B at time $t_2$ by adding the deviation to the value obtained in Step 1:

$$\hat{b}_{t_2}^j = \tilde{b}_{t_2}^j + \varepsilon_{t_2}^j,$$

where $j = 1, 2, \ldots, m$.
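Steps 1–4 amount to a linear extrapolation whose error is estimated against the most recent actual measurement and then carried forward. A hedged Python sketch of the four steps, using the same illustrative `(time, values)` tuple representation as above:

```python
def extrapolate_at(frame_t5, frame_t4, frame_t3, t2):
    """Case (3): no frame of sensor B falls in (t1, t2]. Extrapolate from the
    two oldest of the three retained frames, measure the deviation of that
    extrapolation at t3, scale the deviation to t2 and apply it."""
    t5, b5 = frame_t5
    t4, b4 = frame_t4
    t3, b3 = frame_t3
    out = []
    for v5, v4, v3 in zip(b5, b4, b3):
        slope = (v4 - v5) / (t4 - t5)
        pred_t3 = v5 + slope * (t3 - t5)         # Step 1: extrapolate to t3
        pred_t2 = v5 + slope * (t2 - t5)         # Step 1: extrapolate to t2
        dev_t3 = v3 - pred_t3                    # Step 2: deviation at t3
        dev_t2 = dev_t3 * (t2 - t4) / (t3 - t4)  # Step 3: deviation scaled to t2
        out.append(pred_t2 + dev_t2)             # Step 4: corrected estimate
    return (t2, out)

# Frames at 5 s, 7 s and 9 s; estimate the single parameter at t2 = 10 s.
est = extrapolate_at((5.0, [1.0]), (7.0, [2.0]), (9.0, [3.2]), 10.0)
```

In this example the line through the two oldest frames has slope 0.5, predicting 3.0 at 9 s against the measured 3.2; the 0.2 deviation is scaled by (10−7)/(9−7) and added to the raw extrapolation of 3.5, giving 3.8.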
The above method realizes the calculation of the data $\hat{B}_{t_2}$ of sensor B at time $t_2$; the data of the other sensors are calculated in the same way and are not described again.
Finally, the data of all sensors at time $t_2$ are combined to obtain fusion data $F_{t_2}$ meeting the requirements of time synchronization and real-time operation. The data are combined as follows:

$$F_{t_2} = [t_2,\ a_{t_2}^1, \ldots, a_{t_2}^{m},\ \hat{b}_{t_2}^1, \ldots, \hat{b}_{t_2}^{m},\ \hat{c}_{t_2}^1, \ldots],$$

i.e., the common time stamp $t_2$ followed by the measured parameter values of sensor A and the calculated parameter values of the other sensors.
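Once every sensor has a frame aligned to sensor A's time stamp, the fusion step is a concatenation behind that common time. A minimal sketch, again assuming the illustrative `(time, values)` tuple representation:

```python
def fuse(t2, frames):
    """Concatenate all sensors' measured or calculated parameter values at t2
    into one time-synchronized fusion record [t2, a..., b..., c...]."""
    record = [t2]
    for t, values in frames:
        if t != t2:  # every frame must already be aligned to t2
            raise ValueError("frame not aligned to fusion time t2")
        record.extend(values)
    return record

# Sensor A measured two parameters; B and C values were calculated for t2 = 12 s.
fused = fuse(12.0, [(12.0, [3.1, 47.2]), (12.0, [2.5]), (12.0, [0.8])])
# → [12.0, 3.1, 47.2, 2.5, 0.8]
```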
the following describes the above embodiment by way of a specific example:
the time set of each frame of data acquired by the sensor A is T= [10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44,46,48,50] with the unit of seconds; the data collected by the sensor B and the sensor C are respectively as follows:
by adopting the method of the invention, the data corresponding to all moments of the sensor B and the sensor C in the time set T are respectively calculated, and the calculation results are as follows:
FIG. 3 compares the calculation results of sensor B with its acquired data; the symbol "." indicates an acquired value and the symbol "×" a calculated value.
FIG. 4 compares the calculation results of sensor C with its acquired data, using the same symbols.
The principles and embodiments of the present invention have been described herein with reference to specific examples; the description is intended only to assist in understanding the method of the present invention and its core ideas. Modifications made by those of ordinary skill in the art in light of these teachings likewise fall within the scope of the invention. In summary, this description should not be construed as limiting the invention.
Claims (3)
1. A multi-source real-time data fusion method, characterized in that each sensor appends its acquisition time when outputting an acquisition signal, and all sensors are time-calibrated and synchronized before a work task starts; during the work task, one of the sensors is taken as the reference and recorded as sensor A, and the real-time data of the other sensors are calculated: the acquisition times of the latest two frames of data received from sensor A in real time are $t_1$ and $t_2$, with the condition $t_1 < t_2$; the data of the other sensors at time $t_2$ are respectively calculated; and finally the data of all sensors at time $t_2$ are combined to obtain fusion data meeting the time-synchronization requirement; the specific calculation method is as follows:
each frame of data output by sensor a is noted as:wherein t represents the time at which sensor A acquired the data, < >>The data value of the jth parameter at time t is represented by calculating data of other sensors using sensor B as an example, and each frame of data output by sensor B is expressed as: />Wherein t represents the time at which sensor B acquired the data,/->A data value representing the j-th parameter at time t;
the latest two frames of data acquired by the sensor A in real time are recorded as follows:and->Wherein->The following cases are calculated at +.>Sensor B data of time instant->Wherein->For the j-th parameter at->Calculated value of time:
(1) when sensor B collects data at time $t_2$, i.e., the collected data are $B_{t_2}$, then $\hat{B}_{t_2} = B_{t_2}$, i.e., $\hat{b}_{t_2}^j = b_{t_2}^j$, wherein $j = 1, 2, \ldots, m$;
(2) when sensor B does not collect data at time $t_2$ but collects one or more frames of data in the interval $(t_1, t_2)$: take the frame collected in $(t_1, t_2)$ closest to $t_2$, recorded as $B_{t_3}$, and the frame collected in $(0, t_1]$ closest to $t_1$, recorded as $B_{t_4}$; calculate the data $\hat{B}_{t_2}$ at time $t_2$ according to the interpolation method $\hat{b}_{t_2}^j = b_{t_4}^j + \frac{t_2 - t_4}{t_3 - t_4}(b_{t_3}^j - b_{t_4}^j)$, wherein $j = 1, 2, \ldots, m$;
(3) when sensor B does not collect data at time $t_2$ and no data are collected in the interval $(t_1, t_2]$: take the three frames collected in $(0, t_1]$ closest to $t_1$, recorded as $B_{t_3}$, $B_{t_4}$ and $B_{t_5}$, wherein $t_5 < t_4 < t_3 \le t_1$; the data $\hat{B}_{t_2}$ at time $t_2$ are then calculated in the following steps:
step 1, from the collected data $B_{t_4}$ and $B_{t_5}$, respectively calculate by interpolation the data at time $t_3$ and at time $t_2$: $\tilde{b}_{t_3}^j = b_{t_5}^j + \frac{t_3 - t_5}{t_4 - t_5}(b_{t_4}^j - b_{t_5}^j)$ and $\tilde{b}_{t_2}^j = b_{t_5}^j + \frac{t_2 - t_5}{t_4 - t_5}(b_{t_4}^j - b_{t_5}^j)$, wherein $j = 1, 2, \ldots, m$;
step 2, calculate the deviation between the data $b_{t_3}^j$ collected at time $t_3$ and the calculated data $\tilde{b}_{t_3}^j$: $\varepsilon_{t_3}^j = b_{t_3}^j - \tilde{b}_{t_3}^j$, wherein $j = 1, 2, \ldots, m$;
step 3, calculate the deviation at time $t_2$: $\varepsilon_{t_2}^j = \frac{t_2 - t_4}{t_3 - t_4}\,\varepsilon_{t_3}^j$, wherein $j = 1, 2, \ldots, m$;
step 4, calculate the data of sensor B at time $t_2$: $\hat{b}_{t_2}^j = \tilde{b}_{t_2}^j + \varepsilon_{t_2}^j$, wherein $j = 1, 2, \ldots, m$;
the data of the other sensors at time $t_2$ are calculated by the above method.
2. The method of claim 1, wherein the acquisition frequencies of the respective sensors are not identical.
3. The multi-source real-time data fusion method according to claim 1, wherein, at time $t_2$, after the data of all sensors have been calculated, all the data are combined to obtain fusion data meeting the requirements of time synchronization and real-time operation, the data being combined as follows: $F_{t_2} = [t_2,\ a_{t_2}^1, \ldots, a_{t_2}^{m},\ \hat{b}_{t_2}^1, \ldots, \hat{b}_{t_2}^{m},\ \hat{c}_{t_2}^1, \ldots]$.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311291695.9A CN117034201A (en) | 2023-10-08 | 2023-10-08 | Multi-source real-time data fusion method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117034201A true CN117034201A (en) | 2023-11-10 |
Family
ID=88645238
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311291695.9A Pending CN117034201A (en) | 2023-10-08 | 2023-10-08 | Multi-source real-time data fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117034201A (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103105611A (en) * | 2013-01-16 | 2013-05-15 | 广东工业大学 | Intelligent information fusion method of distributed multi-sensor |
WO2020253260A1 (en) * | 2019-06-21 | 2020-12-24 | 上海商汤临港智能科技有限公司 | Time synchronization processing method, electronic apparatus, and storage medium |
US20210099711A1 (en) * | 2019-09-27 | 2021-04-01 | Apple Inc. | Dynamic Point Cloud Compression Using Inter-Prediction |
CN113918652A (en) * | 2021-10-19 | 2022-01-11 | 广州极飞科技股份有限公司 | Data synchronization method and device and readable storage medium |
CN114001841A (en) * | 2021-12-06 | 2022-02-01 | 哈尔滨理工大学 | Temperature measurement method for photovoltaic module |
CN115577320A (en) * | 2022-10-15 | 2023-01-06 | 北京工业大学 | Multi-sensor asynchronous data fusion method based on data interpolation |
CN116340736A (en) * | 2022-12-26 | 2023-06-27 | 湖南华诺星空电子技术有限公司 | Heterogeneous sensor information fusion method and device |
CN116702064A (en) * | 2023-05-26 | 2023-09-05 | 云南电网有限责任公司电力科学研究院 | Method, system, storage medium and equipment for estimating operation behavior of electric power tool |
CN116720144A (en) * | 2023-06-21 | 2023-09-08 | 大连理工大学 | Free piston linear motor operation mode identification method based on data fusion and feature extraction |
CN116822743A (en) * | 2023-07-05 | 2023-09-29 | 淮阴工学院 | Wind power prediction method based on two-stage decomposition reconstruction and error correction |
- 2023-10-08: CN application CN202311291695.9A filed (published as CN117034201A, status Pending)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117708514A (en) * | 2024-02-06 | 2024-03-15 | 东营航空产业技术研究院 | Data processing method based on multiple sensors |
CN117708514B (en) * | 2024-02-06 | 2024-04-09 | 东营航空产业技术研究院 | Data processing method based on multiple sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||