CN113353090A - Data synchronization system, data synchronization method, positioning system and unmanned equipment - Google Patents
- Publication number
- CN113353090A (application number CN202110666331.9A)
- Authority
- CN
- China
- Prior art keywords
- time
- data
- real
- clock
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W2050/0001—Details of the control system
- B60W2050/0043—Signal treatments, identification of variables or parameters, parameter estimation or state estimation
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
Abstract
The embodiment of the invention relates to the technical field of data processing and discloses a data synchronization system comprising: a controller for acquiring real-time sensing data sent by a plurality of sensing modules and recording the acquisition time of each piece of real-time sensing data, and for generating an exposure trigger signal, recording the trigger time of the exposure trigger signal, and sending the exposure trigger signal to a camera module, where both the acquisition time and the trigger time are recorded with a real-time clock as the clock source; a camera module for generating image data according to the exposure trigger signal and recording the generation time of the image data, the generation time likewise being recorded with the real-time clock as the clock source; and a data fusion module for matching the real-time sensing data with the image data according to the acquisition time, the trigger time, and the generation time, and performing data fusion processing according to the matching result. The embodiment of the invention improves the accuracy of data processing.
Description
Technical Field
The embodiment of the invention relates to the technical field of data processing, and in particular to a data synchronization system, a data synchronization method, a positioning system, and an unmanned device.
Background
With the development of technologies such as automatic driving, obstacles must be positioned and their type and distance identified with ever greater accuracy. More and more software schemes fuse multiple radar sensors with vision to position objects, and algorithm modules place increasingly strict real-time and synchronization requirements on the data, so real-time synchronization between image data and the data of external devices such as CAN buses is critical. However, the inventor found that in the prior art, sensor data must first be transmitted to an SOC (System on Chip — the embedded main processor responsible for fusing image data with device data such as CAN and radar) or a controller before being passed to the algorithm process, which introduces a certain delay in the reception and handling by the processor. Moreover, each of the many devices keeps its own time domain, so a certain time error exists between the data of the sensor modules and the image data, and consequently a certain error also exists in the result of the algorithm calculation; this error reduces accuracy and safety. Existing software synchronization methods cannot maintain the synchronization relationship of the data — in particular, data with close timestamps are easily reordered, so the algorithm ends up computing on temporally distant or abnormal data.
Disclosure of Invention
In view of the foregoing problems, embodiments of the present invention provide a data synchronization system, a data synchronization method, a positioning system, and an unmanned device, so as to address the data synchronization errors present in the prior art.
According to an aspect of an embodiment of the present invention, a data synchronization system is provided, which includes a sensing module, a controller, a camera module, and a data fusion module;
the controller is used for acquiring the real-time sensing data sent by a plurality of sensing modules and recording the acquisition time of each piece of real-time sensing data;
the controller is also used for generating an exposure trigger signal, recording the trigger time of the exposure trigger signal and sending the exposure trigger signal to the camera module; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source;
the camera module is used for shooting according to the exposure trigger signal, generating image data and recording the generation time of the image data; the generation time is recorded by taking a real-time clock as a clock source;
and the data fusion module is used for matching the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal and the generation time of the image data, and performing data fusion processing according to a matching result.
In an optional manner, the data synchronization system further includes a clock source module; and the clock source module is used for providing real-time clock time for the controller.
In an optional manner, the controller is further configured to acquire a real-time clock time of the clock source module, and determine the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time.
In an optional manner, the acquiring the real-time clock time of the clock source module, and determining the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time, include: correcting the clock signal of the controller according to the real-time clock time to obtain a corrected clock signal; and determining the acquisition time and the trigger time of the exposure trigger signal according to the corrected clock signal.
In an optional manner, the sensing module comprises a radar and a bus module, and the real-time sensing data comprises radar data and bus data; the matching the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, and performing data fusion processing according to a matching result, includes: sequencing the real-time sensing data and the exposure trigger signals in order of acquisition time and trigger time; and determining the real-time sensing data corresponding to the image data by matching the generation time of the image data with the trigger time of the exposure trigger signal.
In an optional manner, the exposure trigger signal is a PWM signal or a GPIO signal.
According to another aspect of the embodiments of the present invention, there is provided a positioning system including the data synchronization system described above.
According to another aspect of embodiments of the present invention, there is provided an unmanned device comprising the data synchronization system described above or the positioning system described above.
According to another aspect of the embodiments of the present invention, there is provided a data synchronization method, including:
a controller acquires real-time sensing data sent by a plurality of sensing modules and records the acquisition time of each piece of real-time sensing data;
the controller generates an exposure trigger signal, records the trigger time of the exposure trigger signal and sends the exposure trigger signal to the camera module; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source;
the camera module shoots according to the exposure trigger signal to generate image data, and records the generation time of the image data; the generation time is recorded by taking a real-time clock as a clock source;
and the data fusion module matches the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, and performs data fusion processing according to the matching result.
In an optional manner, the method further comprises: and the clock source module provides real-time clock time for the controller.
The embodiment of the invention adopts the same clock source, so that both the received data and the data generated by triggering are based on the same time axis. Since every piece of data received by the data fusion module carries an accurate time, data matching can be performed according to the corresponding times, thereby achieving data synchronization.
The foregoing description is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be more clearly understood and implemented according to the content of the description, and in order to make the above and other objects, features, and advantages of the embodiments more comprehensible, the detailed description of the invention is provided below.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a schematic structural diagram of a data synchronization system provided by an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a data synchronization system according to another embodiment of the present invention;
FIG. 3 is a timing diagram illustrating data and time relationships in a data synchronization system provided by an embodiment of the present invention;
fig. 4 is a diagram illustrating a relationship between data received by a data fusion module in the data synchronization system according to the embodiment of the present invention;
fig. 5 is a flowchart illustrating a data synchronization method according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein.
Fig. 1 shows a schematic structural diagram of an embodiment of the data synchronization system of the present invention, and as shown in fig. 1, the system 10 includes a sensing module 120, a controller 110, a data fusion module 130, a camera module 140, and a clock source module 150. Wherein:
the controller 110 is configured to acquire the real-time sensing data sent by the plurality of sensing modules 120, and record acquisition times of the real-time sensing data respectively.
The controller 110 is further configured to generate an exposure trigger signal, record a trigger time of the exposure trigger signal, and send the exposure trigger signal to the camera module 140; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source.
The camera module 140 is configured to perform shooting according to the exposure trigger signal, generate image data, and record generation time of the image data; the generation time is the time recorded by taking a real-time clock as a clock source.
And the data fusion module 130 is configured to match the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, and perform data fusion processing according to a matching result.
Specifically, the system of the embodiment of the present invention works as follows. The controller 110 receives the real-time sensing data sent by each sensing module 120 and records the time at which the real-time sensing data is acquired; the acquisition time may be recorded in the form of a timestamp. When the camera module 140 is required to capture an image, the controller 110 sends an exposure trigger signal to the camera module 140 and records the trigger time at which the signal is sent. The camera module 140 performs shooting based on the exposure trigger signal to generate image data, and records the time at which the image data is generated. The data fusion module 130 receives the real-time sensing data, the image data, the acquisition times, the trigger times, and the generation times; it analyzes the real-time sensing data and the image data according to these times, determines the time sequence of each piece of data, determines the correspondence between the real-time sensing data and the image data to obtain a matching result, and fuses the real-time sensing data and the image data according to the matching result.
Fig. 3 is a timing diagram of the data and time relationships in the data synchronization system according to the embodiment of the present invention. As the figure shows, in the existing mode the sensing module sends DATA1-DATA10 to the MCU, and after DATA3 is sent, the MCU sends a PWM1 signal to the camera module. The camera module transmits the CameraData to the SOC, while the MCU transmits the trigger time of PWM1 to the SOC only after sending the PWM1 signal, so the CameraData can reach the SOC earlier than the corresponding trigger-time record does. In reality, however, the trigger time must exist before the CameraData it generates. The same occurs for the trigger sent after DATA6: the CameraData again reaches the SOC earlier than the MCU's trigger-time record. The order in which the SOC receives the data is therefore inconsistent with the real order of events. In the embodiment of the invention, the generation times of the sensing data, the exposure trigger signals, and the image data are instead all based on a uniform timestamp. The matching of the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, with data fusion processing performed according to the matching result, includes: sequencing the real-time sensing data and the exposure trigger signals in order of acquisition time and trigger time; and determining the real-time sensing data corresponding to the image data by matching the generation time of the image data with the trigger time of the exposure trigger signal.
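The sequencing-and-matching step described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Record` type, its field names, and the nearest-trigger matching rule are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Record:
    kind: str         # "sensor", "trigger", or "image"
    timestamp: float  # time taken from the shared real-time clock source
    label: str

def sequence_and_match(sensor_records, trigger_records, image_records):
    """Sort sensing data and exposure triggers onto one time axis, then pair
    each image with the trigger whose trigger time is closest to the image's
    generation time."""
    timeline = sorted(sensor_records + trigger_records, key=lambda r: r.timestamp)
    matches = {
        img.label: min(trigger_records,
                       key=lambda t: abs(t.timestamp - img.timestamp)).label
        for img in image_records
    }
    return timeline, matches
```

Each image thereby inherits the position of its trigger on the common time axis, so the sensing data recorded around that trigger can be fused with it.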
The sensing module 120 includes a radar module, a CAN module, and other sensing modules.
In the embodiment of the present invention, the system further includes a clock source module 150 configured to provide a real-time clock time for the controller 110. The internal clock signal of the controller 110 can be corrected against the clock source module 150, so that the times recorded for the data are more accurate. The clock source module 150 may be a GPS module, in which case the time recorded by the camera module 140 also adopts the GPS clock source signal. The controller 110 is further configured to obtain the real-time clock time of the clock source module and to determine the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time. This includes: correcting the clock signal of the controller according to the real-time clock time to obtain a corrected clock signal; and determining the acquisition time and the trigger time of the exposure trigger signal according to the corrected clock signal.
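The clock-correction step can be modeled as a simple offset adjustment against the shared reference. This is an illustrative sketch only — a real controller would typically discipline its oscillator in hardware (e.g. against a GPS pulse-per-second signal) rather than hold a software offset, and all names here are assumptions.

```python
class CorrectedClock:
    """Local controller clock corrected against a shared real-time clock source."""

    def __init__(self):
        self.offset = 0.0  # reference time minus local time

    def correct(self, local_time, reference_time):
        # Record how far the local clock deviates from the clock source.
        self.offset = reference_time - local_time

    def timestamp(self, local_time):
        # Convert a local reading into the shared time base used for all records.
        return local_time + self.offset
```

Two controllers that each correct against the same clock source then produce timestamps on the same time axis, which is the property the data fusion module relies on.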
The embodiment of the invention adopts the same clock source, so that both the received data and the data generated by triggering are based on the same time axis. Since every piece of data received by the data fusion module carries an accurate time, data matching can be performed according to the corresponding times, thereby achieving data synchronization.
Referring to fig. 2, fig. 2 shows a schematic structural diagram of a data synchronization system according to another embodiment of the present invention. The system includes at least two controllers, a plurality of sensing modules, and a plurality of camera modules. As shown in fig. 2, it specifically includes: a data fusion module 130, a clock source module 150, a first controller 1101, a second controller 1102, a first sensing module 1201, a second sensing module 1202, a first camera module 1401, and a second camera module 1402. The first sensing module 1201 is connected to the data fusion module 130 through the first controller 1101, and the second sensing module 1202 is connected to the data fusion module 130 through the second controller 1102. The first controller 1101 is connected to the first camera module 1401, which is in turn connected to the data fusion module 130; the second controller 1102 is connected to the second camera module 1402, which is in turn connected to the data fusion module 130. The clock source module 150 is connected to the first controller 1101 and the second controller 1102 respectively. The first sensing module 1201 comprises a plurality of sensing modules, which may be of the same or different types and are installed at different locations; it includes a CAN module, a radar module, and other sensing modules. The CAN module is configured to send bus data to the first controller 1101, and the radar module is configured to send laser detection data to the first controller 1101; the other sensing modules may be a temperature sensing module, an infrared module, a humidity sensing module, and the like. The data fusion module 130 may be an SOC (System on Chip — the embedded main processor responsible for fusing image data with device data such as CAN and radar).
In the embodiment of the present invention, the controller 110 sends the data to the data fusion module 130 through transmission modes such as SPI/NET.
Specifically, the system of this embodiment works as follows. The first controller 1101 and the second controller 1102 each receive the real-time clock time sent by the clock source module 150 and correct their internal clock signals according to it, so that the clock signals of the two controllers are synchronized. The first controller 1101 receives, in real time, a plurality of pieces of first sensing data sent by each first sensing module 1201 and records the acquisition time of the received first sensing data according to its corrected clock signal; a timestamp may be added to the first sensing data according to the corrected clock signal. Similarly, the second controller 1102 receives, in real time, a plurality of pieces of second sensing data sent by each second sensing module 1202 and records the acquisition time of the received second sensing data according to its corrected clock signal; a timestamp identifier may likewise be added to the second sensing data according to the corrected clock signal. When image data needs to be acquired, the first controller 1101 sends a first exposure trigger signal to the first camera module 1401 and adds a timestamp identifier to the first exposure trigger signal so as to record its trigger time; and/or the second controller 1102 sends a second exposure trigger signal to the second camera module 1402 and adds a timestamp identifier to the second exposure trigger signal to record its trigger time.
The first controller 1101 transmits the time-stamped first sensing data and the trigger time of the first exposure trigger signal to the data fusion module 130, and the second controller 1102 transmits the time-stamped second sensing data and the trigger time of the second exposure trigger signal to the data fusion module 130. Wherein, the exposure trigger signal can be a PWM signal or a GPIO signal.
The first image pickup module 1401 receives a first exposure trigger signal, and performs image pickup based on the first exposure trigger signal to generate first image data. The second camera module 1402 receives the second exposure trigger signal, and performs shooting according to the second exposure trigger signal to obtain second image data. The first image capturing module 1401 adds a time stamp to the first image data after the first image data is generated, to record the generation time of the first image data. The second camera module 1402 generates second image data, and then adds a time stamp to the second image data to record the time of generation of the second image data. The first camera module 1401 transmits the first image data added with the time stamp to the data fusion module 130, and the second camera module 1402 transmits the second image data added with the time stamp to the data fusion module 130.
The data fusion module 130 receives the first sensing data with the timestamp added, the second sensing data with the timestamp added, the first exposure trigger signal with the timestamp added, the second exposure trigger signal with the timestamp added, the first image data with the timestamp added, and the second image data with the timestamp added.
The data fusion module 130 sorts the first sensing data, the second sensing data, the first exposure trigger signal, the second exposure trigger signal, the first image data, and the second image data in order of their timestamps, and thereby determines the time sequence of each piece of data. As shown in fig. 4, the data stream flowing from the controllers to the data fusion module — the sensing data and the exposure trigger signals, each with its corresponding timestamp — is sorted by timestamp, yielding the data ordered along the time axis: DATA1 (sensing data), DATA2, DATA3, PWM1, DATA4, DATA5, DATA6, PWM2, DATA7, DATA8, DATA9, PWM3. The data streams that the data fusion module 130 receives from the first camera module 1401 and the second camera module 1402 include CameraData1, CameraData2, and CameraData3. If the generation time of CameraData1 is determined from the timestamps to be closest to the trigger time of PWM1, then CameraData1 is located between DATA3 and DATA4; the positions of CameraData2 and CameraData3 on the time axis can be determined accordingly. The relationships between the data are thus determined by the time sequence. For example, in the mode in which the controller receives the sensing data and then controls the camera module to capture an image, once the data order is determined, DATA1, DATA2, DATA3, and CameraData1 may be taken as one group of data; DATA4, DATA5, DATA6, and CameraData2 as a second group; and DATA7, DATA8, DATA9, and CameraData3 as a third group. The SOC can then perform fusion processing on each group of data to obtain the target result.
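The grouping just described — the sensing data accumulated up to each trigger, paired with the image matched to that trigger — can be sketched as follows. The function name and the nearest-timestamp matching rule are assumptions made for illustration.

```python
def group_by_trigger(stream, images):
    """stream: (timestamp, label) pairs for sensing data and PWM triggers,
    in any order; images: (timestamp, label) pairs for camera data.
    Returns one group per trigger: the sensing data received before that
    trigger plus the image whose generation time is nearest the trigger time."""
    groups, pending = [], []
    for ts, label in sorted(stream):
        if label.startswith("PWM"):
            # Match this trigger to the image generated closest to it in time.
            _, image = min(images, key=lambda im: abs(im[0] - ts))
            groups.append((pending, image))
            pending = []
        else:
            pending.append(label)
    return groups
```

Run on timestamps laid out like fig. 4, this reproduces the three groups named in the text: DATA1-DATA3 with CameraData1, DATA4-DATA6 with CameraData2, and DATA7-DATA9 with CameraData3.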
In the embodiment of the invention, the same clock source module is used to correct the clock signals of all the controllers, so that both the received data and the data generated by triggering are based on the same time axis. Since every piece of data received by the data fusion module carries an accurate time, data matching can be performed according to the corresponding times, thereby achieving data synchronization.
Fig. 5 is a flowchart illustrating a data synchronization method according to an embodiment of the present invention. The data synchronization method is based on the data synchronization system of the above embodiment. As shown in fig. 5, the method comprises the steps of:
step 210: the controller acquires real-time sensing data sent by the sensing modules and respectively records the acquisition time of the real-time sensing data.
Step 220: the controller generates an exposure trigger signal, records the trigger time of the exposure trigger signal and sends the exposure trigger signal to the camera module; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source.
Step 230: the camera module shoots according to the exposure trigger signal to generate image data, and records the generation time of the image data; the generation time is the time recorded by taking a real-time clock as a clock source.
Step 240: The data fusion module matches the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, and performs data fusion processing according to the matching result.
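Steps 210-240 can be strung together in a small end-to-end simulation. This sketch is illustrative only — the class names are assumptions, and the single shared clock function stands in for the clock-source-corrected clocks of the real system.

```python
import itertools

class Camera:
    """Generates image data on an exposure trigger and timestamps it (step 230)."""

    def __init__(self, clock):
        self.clock = clock
        self.images = []  # (generation_time, image_label)

    def expose(self, trigger_label):
        self.images.append((self.clock(), f"Image-for-{trigger_label}"))

class Controller:
    """Timestamps incoming sensing data and outgoing exposure triggers
    against the shared real-time clock (steps 210 and 220)."""

    def __init__(self, clock):
        self.clock = clock
        self.records = []  # (timestamp, label) pairs passed on to fusion

    def on_sensor_data(self, label):
        self.records.append((self.clock(), label))

    def trigger_exposure(self, camera, label):
        self.records.append((self.clock(), label))  # record the trigger time
        camera.expose(label)                        # send the exposure trigger signal

# A deterministic shared clock for the simulation: 1, 2, 3, ...
tick = itertools.count(1)
clock = lambda: next(tick)

cam = Camera(clock)
ctrl = Controller(clock)
ctrl.on_sensor_data("DATA1")
ctrl.on_sensor_data("DATA2")
ctrl.trigger_exposure(cam, "PWM1")
```

Because every record carries a time from the same clock, the fusion step (step 240) can order them correctly even if the image data arrives at the fusion module before the trigger-time record, which is exactly the failure mode described in the Background.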
In an optional manner, the data synchronization system further includes a clock source module; and the clock source module is used for providing real-time clock time for the controller.
In an optional manner, the controller is further configured to acquire a real-time clock time of the clock source module, and determine the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time.
In an optional manner, the acquiring the real-time clock time of the clock source module, and determining the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time, include: correcting the clock signal of the controller according to the real-time clock time to obtain a corrected clock signal; and determining the acquisition time and the trigger time of the exposure trigger signal according to the corrected clock signal.
In an optional manner, the sensing module comprises a radar and a bus module, and the real-time sensing data comprises radar data and bus data; the matching the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal, and the generation time of the image data, and performing data fusion processing according to a matching result, includes: sequencing the real-time sensing data and the exposure trigger signals in order of acquisition time and trigger time; and determining the real-time sensing data corresponding to the image data by matching the generation time of the image data with the trigger time of the exposure trigger signal.
In an optional manner, the exposure trigger signal is a PWM signal or a GPIO signal.
The embodiment of the invention adopts the same clock source, so that both the received data and the data generated by triggering are based on the same time axis. Since every piece of data received by the data fusion module carries an accurate time, data matching can be performed according to the corresponding times, thereby achieving data synchronization.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general purpose systems may also be used with the teachings herein. The required structure for constructing such a system will be apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions of specific languages are provided above to disclose the best mode of the invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules of the device in an embodiment may be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units or components of the embodiments may be combined into one module, unit or component, or divided into a plurality of sub-modules, sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the steps or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose, unless expressly stated otherwise.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not indicate any ordering; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless otherwise specified.
Claims (10)
1. A data synchronization system is characterized by comprising a sensing module, a controller, a camera module and a data fusion module;
the controller is used for acquiring the real-time sensing data sent by the sensing module and recording the acquisition time of the real-time sensing data;
the controller is also used for generating an exposure trigger signal, recording the trigger time of the exposure trigger signal and sending the exposure trigger signal to the camera module; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source;
the camera module is used for shooting according to the exposure trigger signal, generating image data and recording the generation time of the image data; the generation time is recorded by taking a real-time clock as a clock source;
and the data fusion module is used for matching the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal and the generation time of the image data, and performing data fusion processing according to a matching result.
2. The system of claim 1, wherein the data synchronization system further comprises a clock source module;
and the clock source module is used for providing real-time clock time for the controller.
3. The system according to claim 2, wherein the controller is further configured to obtain a real-time clock time from the clock source module, and to determine the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time.
4. The system according to claim 3, wherein said obtaining the real-time clock time of the clock source module and determining the acquisition time and the trigger time of the exposure trigger signal according to the real-time clock time comprises:
correcting the clock signal of the controller according to the real-time clock time to obtain a corrected clock signal;
and determining the acquisition time and the trigger time of the exposure trigger signal according to the corrected clock signal.
5. The system of any of claims 1-4, wherein the sensing module comprises a radar and a bus module; the real-time sensing data comprises radar data and bus data;
the matching the real-time sensing data and the image data according to the acquisition time, the trigger time of the exposure trigger signal and the generation time of the image data, and performing data fusion processing according to a matching result includes:
sorting the real-time sensing data and the exposure trigger signals in order of acquisition time and trigger time;
and according to the matching of the generation time of the image data and the trigger time of the exposure trigger signal, determining the real-time sensing data corresponding to the image data.
6. The system of claim 1, wherein the exposure trigger signal is a PWM signal or a GPIO signal.
7. A positioning system, characterized in that it comprises a data synchronization system according to any of claims 1-6.
8. An unmanned aerial device, wherein the unmanned aerial device comprises a data synchronization system according to any of claims 1-6 or a positioning system according to claim 7.
9. A method for synchronizing data, the method comprising:
the method comprises the steps that a controller obtains real-time sensing data sent by a plurality of sensing modules and respectively records the obtaining time of the real-time sensing data;
the controller generates an exposure trigger signal, records the trigger time of the exposure trigger signal and sends the exposure trigger signal to the camera module; the acquisition time and the trigger time are recorded by taking a real-time clock as a clock source;
the camera module shoots according to the exposure trigger signal to generate image data, and records the generation time of the image data; the generation time is recorded by taking a real-time clock as a clock source;
and the data fusion module matches the real-time sensing data and the image data according to the acquisition time, the triggering time of the exposure triggering signal and the generation time of the image data, and performs data fusion processing according to a matching result.
10. The method of claim 9, further comprising: and the clock source module provides real-time clock time for the controller.
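The claimed method flow can be illustrated end to end with a small simulation: one real-time clock source stamps the sensing data, the exposure trigger, and the image, so the fusion step pairs them purely by timestamp. All names and the integer clock stand-in below are illustrative assumptions, not the patented implementation.

```python
# Stand-in for a real-time clock source: each call returns the next tick.
rtc = iter(range(100))

def rtc_now():
    return next(rtc)

# Controller: stamp incoming sensing data and the exposure trigger
# on the shared RTC time base.
sensing = [{"data": "radar_frame", "t": rtc_now()},   # t = 0
           {"data": "bus_frame",   "t": rtc_now()}]   # t = 1
trigger_t = rtc_now()                                 # t = 2, sent to the camera

# Camera: expose on the trigger; the image generation time matches
# the trigger time on the same time base.
image = {"pixels": "...", "t": trigger_t}

# Data fusion: pair the image with the sensing record nearest in time.
paired = min(sensing, key=lambda s: abs(s["t"] - image["t"]))
```

Here the bus frame (t = 1) is the sensing record closest to the image (t = 2), so it is the one fused with the image.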
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110666331.9A CN113353090A (en) | 2021-06-16 | 2021-06-16 | Data synchronization system, data synchronization method, positioning system and unmanned equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113353090A (en) | 2021-09-07
Family
ID=77534652
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110666331.9A Pending CN113353090A (en) | 2021-06-16 | 2021-06-16 | Data synchronization system, data synchronization method, positioning system and unmanned equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113353090A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100183034A1 (en) * | 2009-01-16 | 2010-07-22 | Microsoft Corporation | Synchronization of multi-time base data sources |
CN109983414A (en) * | 2016-12-07 | 2019-07-05 | 深圳市大疆创新科技有限公司 | System and method for supporting to synchronize in moveable platform |
US20190324449A1 (en) * | 2016-12-07 | 2019-10-24 | SZ DJI Technology Co., Ltd. | System and method for supporting synchronization in a movable platform |
CN112672415A (en) * | 2020-12-25 | 2021-04-16 | 之江实验室 | Multi-sensor time synchronization method, device, system, electronic device and medium |
CN112945228A (en) * | 2021-02-04 | 2021-06-11 | 刘成 | Multi-sensor time synchronization method and synchronization device |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113922910A (en) * | 2021-10-09 | 2022-01-11 | 广东汇天航空航天科技有限公司 | Sensor time synchronization processing method, device and system |
CN113922910B (en) * | 2021-10-09 | 2023-09-19 | 广东汇天航空航天科技有限公司 | Sensor time synchronization processing method, device and system |
Similar Documents
Publication | Title
---|---
CN108900272B (en) | Sensor data acquisition method and system and packet loss judgment method
CN111239790B (en) | Vehicle navigation system based on 5G network machine vision
CN109587405B (en) | Time synchronization method and device
CN102668534B (en) | Data search, parser, and synchronization of video and telemetry data
CN112261283B (en) | Synchronous acquisition method, device and system of high-speed camera
CN109905194A (en) | A vehicle-mounted terminal system and synchronous data acquisition method and device
CN112214009B (en) | Sensor data processing method and device, electronic equipment and system
CN107707626B (en) | Data acquisition card, data acquisition system and data acquisition method based on FPGA
CN112383675B (en) | Time synchronization method and device and terminal equipment
US10938544B2 (en) | Method and device of data synchronization and data collection for aerial vehicle and aerial vehicle
CN106707736A (en) | Automobile instrument clock precision measuring method and automobile instrument clock precision measuring device
CN111860604A (en) | Data fusion method, system and computer storage medium
CN113353090A (en) | Data synchronization system, data synchronization method, positioning system and unmanned equipment
CN113311905B (en) | Data processing system
WO2020113358A1 (en) | Systems and methods for synchronizing vehicle sensors and devices
CN104575006A (en) | Image shooting moment determining method and system as well as vehicle speed detection method and system
CN111279637B (en) | Information synchronization method, unmanned aerial vehicle, load equipment, system and storage medium
CN111193568A (en) | Time synchronization method, device, system, storage medium and vehicle
JP6075377B2 (en) | Communication device, communication system, communication method, and program
CN113129382A (en) | Method and device for determining coordinate conversion parameters
US11656328B2 (en) | Validating object detection hardware and algorithms
CN112995524A (en) | High-precision acquisition vehicle, and photo exposure information generation system, method and synchronization device thereof
CN113992469B (en) | Data fusion method and device, electronic equipment and computer readable medium
CN111464772B (en) | Method and device for setting time stamp on recorded video and electronic equipment for vehicle
CN116744218A (en) | Multi-sensor synchronous positioning method, device, system and storage medium
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
CB02 | Change of applicant information | Address after: 518000 401, Building B1, Nanshan Zhiyuan, No. 1001, Xueyuan Avenue, Changyuan Community, Taoyuan Street, Nanshan District, Shenzhen, Guangdong; Applicant after: Shenzhen Saifang Technology Co.,Ltd.; Address before: 518000 room 701, building B1, Nanshan wisdom garden, 1001 Xueyuan Avenue, Changyuan community, Taoyuan Street, Nanshan District, Shenzhen City, Guangdong Province; Applicant before: Shenzhen Daotong Intelligent Automobile Co.,Ltd. |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210907 |