CN117191075A - Visual inertial odometer system for multi-sensor time synchronization and time synchronization method thereof

Visual inertial odometer system for multi-sensor time synchronization and time synchronization method thereof

Info

Publication number
CN117191075A
CN117191075A
Authority
CN
China
Prior art keywords: sensor, frame, triggering, data, MCU
Prior art date
Legal status: Pending
Application number
CN202311015704.1A
Other languages
Chinese (zh)
Inventor
曾宇
郭晓东
张达
陈卓然
Current Assignee
Guangzhou Zichuan Electronic Technology Co ltd
Original Assignee
Guangzhou Zichuan Electronic Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Zichuan Electronic Technology Co ltd
Priority to CN202311015704.1A
Publication of CN117191075A

Landscapes

  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

The application discloses a visual inertial odometer system for multi-sensor time synchronization and a time synchronization method thereof. The system comprises a computing platform, an MCU, a data bus, a communication interface and two or more sensors. The computing platform is connected with the MCU through the data bus, the MCU is connected with the sensors through the communication interface, and the computing platform is also connected with the communication interface and the sensors respectively. The computing platform is used for setting initialization parameters on the MCU, acquiring the sensor data measured by the sensors and performing preset processing; the MCU is used for controlling the triggering mode of the sensors according to the initialization parameters; and the sensors are used for acquiring the sensor data. The application can reduce the synchronization time error and improve sensor compatibility, and can be widely applied in the field of multi-sensor synchronization.

Description

Visual inertial odometer system for multi-sensor time synchronization and time synchronization method thereof
Technical Field
The application relates to the field of multi-sensor time synchronization, and in particular to a visual inertial odometer system for multi-sensor time synchronization and a time synchronization method thereof.
Background
Sensor time-synchronization methods in conventional visual inertial odometer systems fall into software synchronization and hardware synchronization. Software synchronization suffers from large synchronization time errors and data delays, while hardware synchronization is limited in synchronization frequency, in the types of sensors that can be selected, and in the number of sensors that can be connected.
Disclosure of Invention
In view of the above, the present application provides a visual inertial odometer system for multi-sensor time synchronization and a time synchronization method thereof, so as to reduce synchronization time errors and improve sensor compatibility.
One aspect of the present application provides a visual inertial odometer system for multi-sensor time synchronization, comprising:
a computing platform, an MCU, a data bus, a communication interface and two or more sensors;
the computing platform is connected with the MCU through the data bus, the MCU is connected with the sensors through the communication interface, and the computing platform is also connected with the communication interface and the sensors respectively;
the computing platform is used for setting initialization parameters on the MCU, acquiring the sensor data measured by the sensors and performing preset processing;
the MCU is used for controlling the triggering mode of the sensors according to the initialization parameters;
the sensors are used for acquiring the sensor data.
Optionally, the computing platform adopts an ARM processor or a PC host.
Optionally, the data bus adopts UART;
the communication interface adopts GPIO.
Optionally, the sensors comprise a camera, a ranging radar, an inertial measurement unit, and a positioning sensor.
Optionally, the camera comprises a monocular camera and a multi-view camera;
the ranging radars include TOF and lidar;
the inertial measurement unit includes gyroscopes, accelerometers, and magnetometers;
the positioning sensor comprises RTK, GPS and BeiDou navigation positioning sensors.
In another aspect, the present application provides a time synchronization method for a visual inertial odometer system, including:
receiving initialization parameters sent by a computing platform, wherein the initialization parameters comprise triggering modes of various sensors;
carrying out sensor frame data statistics according to the initialization parameters to obtain frame statistics data;
synchronously triggering the corresponding sensor according to the frame statistics data and the triggering mode;
and determining trigger frame information of the sensor from the frame statistics data, and transmitting the trigger frame information to the computing platform.
Optionally, the initialization parameters include an active triggering mode and a passive triggering mode of the sensor, a sensor frame-interval triggering parameter, and a single-condition triggering mode and a multi-condition triggering mode of the sensor;
the receiving of the initialization parameters sent by the computing platform comprises:
receiving the active triggering mode and the passive triggering mode of the sensor, the sensor frame-interval triggering parameter, and the single-condition triggering mode and the multi-condition triggering mode of the sensor, which are sent by the computing platform.
Optionally, the performing of sensor frame data statistics according to the initialization parameters to obtain the frame statistics includes:
counting the number of times each sensor has been triggered via MCU interrupt counts, comparing this trigger count against the number of frames received through communication with the computing platform, and compiling sensor frame data statistics to obtain the frame statistics.
Optionally, the step of synchronously triggering the corresponding sensor according to the frame statistics and the triggering mode includes:
determining the output frame rate of each sensor according to the frame statistics;
determining a synchronous frame rate according to the output frame rate of each sensor;
and synchronously triggering the corresponding sensors at the synchronous frame rate and in the configured triggering mode.
Optionally, the determining trigger frame information of the sensor from the frame statistics includes:
determining the trigger frames at which the sensors are synchronously triggered according to the synchronous frame rate, and marking those trigger frames as key frames to obtain the trigger frame information.
According to the application, the computing platform configures the initialization parameters on the MCU, and the MCU then configures the triggering mode and trigger frame rate of each sensor according to those parameters, so that all sensors are triggered synchronously; this ensures that different sensors acquire data at the same time point and greatly reduces the synchronization time error. Meanwhile, the MCU packs the synchronously triggered frames into trigger frame information and sends it to the computing platform, which makes it convenient for the computing platform to distinguish the trigger frames and compute on synchronized data.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is an exemplary block diagram of the visual inertial odometer system for multi-sensor time synchronization of the present application;
FIG. 2 is a flowchart of the time synchronization method of a visual inertial odometer system according to an embodiment of the application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It should be noted that although functional block division is performed in a device diagram and a logic sequence is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the block division in the device, or in the flowchart.
The terms "first," "second," and the like in the description, in the claims, and in the above-described figures are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Existing synchronous triggering of sensors has the following problems:
1. Time uncertainty: existing sensor triggering may be affected by external factors, so the trigger time carries a degree of uncertainty; this can lead to time inconsistency or delay in data acquisition, affecting the accuracy and consistency of the data.
2. Unsynchronized data: with existing sensor triggering, data from different sensors may be collected at different time points, causing inconsistencies between the data streams and making them difficult to align and analyze.
3. Difficulty achieving multi-sensor collaboration: in applications requiring multiple sensors to work together, existing sensor triggering can hardly achieve accurate synchronization and collaborative acquisition, which in turn leads to inconsistent data or hampers effective data fusion and analysis.
In view of this, the application uses an MCU combining software and hardware to synchronize the sensors. This effectively solves the time delay (TD) problem produced by the synchronization process, allows multiple sensors to be connected for data synchronization, and supports both passive and active synchronization trigger signals.
Next, the visual inertial odometer system for multi-sensor time synchronization according to the present application is described in detail. Referring to FIG. 1, which shows an exemplary block diagram of the system, the visual inertial odometer system includes: a computing platform, an MCU, a data bus, a communication interface and two or more sensors;
the computing platform is connected with the MCU through the data bus, the MCU is connected with the sensors through the communication interface, and the computing platform is also connected with the communication interface and the sensors respectively;
the computing platform is used for setting initialization parameters on the MCU, acquiring the sensor data measured by the sensors and performing preset processing;
the MCU is used for controlling the triggering mode of the sensors according to the initialization parameters;
the sensors are used for acquiring the sensor data.
Further, the computing platform adopts an ARM processor or a PC host.
Further, the data bus adopts UART; the communication interface adopts GPIO.
Further, the sensors include a camera, a ranging radar, an inertial measurement unit, and a positioning sensor.
Still further, the camera includes a monocular camera and a multi-view camera; the ranging radars include TOF and lidar; the inertial measurement unit (IMU, Inertial Measurement Unit) comprises gyroscopes, accelerometers, and magnetometers; the positioning sensor comprises RTK, GPS and BeiDou navigation positioning sensors, where RTK refers to real-time kinematic, a real-time carrier-phase differential positioning technique.
It should be noted that the visual inertial odometer system of the embodiment of the present application may also be connected to other optional sensors, selected according to the actual situation; the sensors listed above are illustrative only and do not limit the embodiments of the present application.
Specifically, the following beneficial effects can be achieved through IMU-managed triggering:
1. high precision time synchronization: IMUs typically have a relatively high sampling frequency and can provide microsecond time synchronization. This means that more accurate data acquisition and time alignment can be achieved by IMU synchronization triggering.
2. Data consistency: the IMU may provide synchronized triggering of multiple sensor data, ensuring that data from different sensors is collected at the same point in time. This helps to improve the consistency and accuracy of the data, particularly in applications requiring multiple sensors to work in concert.
3. Reducing sampling deviation: sampling deviation among different sensors can be reduced through IMU synchronous triggering. Sampling bias between sensors may cause data inconsistencies or delays in time, and the use of IMU for synchronous triggering may solve these problems to some extent.
In summary, the IMU synchronous triggering can provide higher time precision and data consistency, which is helpful to solve the problems of time uncertainty and data asynchronism of the common sensor triggering. This makes IMU synchronous triggering an important advantage in applications requiring high precision data acquisition and multi-sensor co-operation.
Specifically, the visual inertial odometer system is described below with a specific example.
Synchronous triggering plays a key role in a visual inertial odometer system: the TD introduced in the input data directly affects the accuracy of the output result. The synchronization scheme of the present application effectively solves the problems of excessive and inconsistent TD.
The ARM processor/PC host serves as the computing platform; it computes on the data transmitted by the sensors and processes communication data, such as sensor parameter settings and MCU trigger parameter settings. The MCU controls the overall trigger logic and performs low-data-volume computation with strict real-time requirements, such as frame statistics and GPIO state acquisition control. The MCU hardware synchronous trigger can be connected to a variety of sensors, such as monocular/binocular cameras, TOF, lidar, multi-axis IMUs, RTK, GPS and so on, as well as other optional sensors.
The working principle of the visual inertial odometer system according to the embodiment of the application is as follows: the ARM processor/PC host first performs initialization parameter setting on the MCU through the UART; this can include setting the active/passive triggering mode and the key-frame quantity of each sensor. For example, when the RTK is used as the active trigger condition with a 10 ms trigger interval, the MCU uses the timestamp received from the RTK satellite as the interrupt trigger condition and passively triggers the other sensors. The ARM processor/PC host receives the RTK data packed by the MCU (packing means adding key-frame marking information) during both triggered and non-triggered periods; the data produced during triggering is marked with key-frame information and serves as the time reference, i.e., the key frames are the synchronously triggered frame data.
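As an illustration, a minimal firmware sketch of this RTK-driven trigger flow is given below in C. The helper names (gpio_set_trigger_line, the sensor count, the pending-key-frame flag) are assumptions for illustration only; the patent does not specify an MCU vendor or firmware API.

#include <stdbool.h>
#include <stdint.h>

#define RTK_TRIGGER_PERIOD_MS 10u  /* active trigger interval from the example above */
#define NUM_PASSIVE_SENSORS   3    /* e.g. two cameras plus an IMU (illustrative)    */

/* Hypothetical board-support function: drive one sensor's trigger GPIO. */
extern void gpio_set_trigger_line(int sensor_id, bool level);

static volatile uint32_t g_trigger_count    = 0;     /* one increment per RTK pulse */
static volatile bool     g_keyframe_pending = false; /* consumed by the main loop   */

/* External-interrupt handler bound to the RTK timestamp pulse. */
void rtk_pulse_isr(void)
{
    g_trigger_count++;

    /* Raise the trigger lines of all passively triggered sensors at the same
     * instant so their samples share one time point (a real implementation
     * would later clear the lines to form a pulse). */
    for (int id = 0; id < NUM_PASSIVE_SENSORS; ++id)
        gpio_set_trigger_line(id, true);

    /* The frames produced by this pulse are key (synchronous) frames;
     * the main loop packs this flag into the UART data stream. */
    g_keyframe_pending = true;
}

The main loop would consume g_keyframe_pending when packing the UART stream, so frames captured on the pulse carry the key-frame mark described above.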
The MCU interrupt-driven IO trigger supports both passive and active modes, triggers multiple sensors at the same instant so that their data is output to the ARM processor/PC host simultaneously, and communicates with the ARM processor/PC host through the UART; through the UART the host configures the synchronization parameters of the MCU and reads the current sensor state information from it.
Distinction between active and passive modes: active triggering means only outputting a trigger signal (either a level signal or a soft trigger), while passive triggering means only receiving a trigger signal.
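One plausible way to encode this per-sensor trigger configuration on the MCU is sketched below; the type and field names are illustrative, not taken from the patent:

#include <stdbool.h>
#include <stdint.h>

typedef enum {
    TRIGGER_ACTIVE,   /* the device only outputs a trigger signal  */
    TRIGGER_PASSIVE   /* the device only receives a trigger signal */
} trigger_mode_t;

typedef enum {
    TRIGGER_SIGNAL_LEVEL,  /* electrical level signal on a GPIO line */
    TRIGGER_SIGNAL_SOFT    /* soft trigger sent as a command/message */
} trigger_signal_t;

typedef struct {
    trigger_mode_t   mode;
    trigger_signal_t signal;
    uint16_t         frame_interval_ms;  /* frame-interval trigger parameter      */
    bool             multi_condition;    /* single- vs multi-condition triggering */
} sensor_trigger_cfg_t;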
Next, the time synchronization method of the visual inertial odometer system provided by the application, which runs on the MCU of the system, is described. Referring to FIG. 2, the method may include S200 to S230, as follows:
S200: receive the initialization parameters sent by the computing platform, the initialization parameters including the triggering mode of each sensor.
Further, the initialization parameters comprise the active triggering mode and the passive triggering mode of the sensor, the sensor frame-interval triggering parameter, and the single-condition triggering mode and the multi-condition triggering mode of the sensor;
the receiving of the initialization parameters sent by the computing platform comprises:
receiving the active triggering mode and the passive triggering mode of the sensor, the sensor frame-interval triggering parameter, and the single-condition triggering mode and the multi-condition triggering mode of the sensor, which are sent by the computing platform.
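The patent does not define the byte layout of these parameters on the UART; the following sketch shows one plausible framing and its validation on the MCU side, with an assumed magic byte and XOR checksum:

#include <stdint.h>
#include <string.h>

#define INIT_MAGIC 0xA5u

/* Hypothetical byte layout for one initialization-parameter record. */
typedef struct __attribute__((packed)) {
    uint8_t  magic;             /* frame delimiter (INIT_MAGIC)              */
    uint8_t  sensor_id;
    uint8_t  trigger_mode;      /* 0 = active, 1 = passive                   */
    uint8_t  condition_mode;    /* 0 = single-condition, 1 = multi-condition */
    uint16_t frame_interval_ms; /* frame-interval trigger parameter          */
    uint8_t  checksum;          /* XOR of all preceding bytes                */
} init_param_pkt_t;

/* Returns 0 on success, -1 on a malformed packet. */
int parse_init_params(const uint8_t *buf, init_param_pkt_t *out)
{
    memcpy(out, buf, sizeof(*out));
    if (out->magic != INIT_MAGIC)
        return -1;

    uint8_t sum = 0;
    for (size_t i = 0; i + 1 < sizeof(*out); ++i)  /* all bytes but the checksum */
        sum ^= buf[i];
    return (sum == out->checksum) ? 0 : -1;
}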
S210: perform sensor frame data statistics according to the initialization parameters to obtain the frame statistics.
Further, S210 may include: counting the number of times each sensor has been triggered via MCU interrupt counts, comparing this trigger count against the number of frames received through communication with the computing platform, and compiling sensor frame data statistics to obtain the frame statistics.
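A minimal sketch of this statistics step follows, assuming two hypothetical accessors (read_trigger_count and query_platform_frame_count) for the interrupt counter and the platform's frame report:

#include <stdint.h>

/* Hypothetical accessors: an atomic read of the ISR trigger counter and a
 * UART query of how many frames the computing platform has received. */
extern uint32_t read_trigger_count(uint8_t sensor_id);
extern uint32_t query_platform_frame_count(uint8_t sensor_id);

typedef struct {
    uint32_t triggered;  /* trigger count from MCU interrupts           */
    uint32_t received;   /* frames the platform confirmed over the UART */
} frame_stats_t;

/* Called periodically from the MCU main loop. A mismatch between the two
 * counters indicates dropped frames or a sensor that cannot keep up with
 * the configured trigger rate. */
frame_stats_t collect_frame_stats(uint8_t sensor_id)
{
    frame_stats_t s;
    s.triggered = read_trigger_count(sensor_id);
    s.received  = query_platform_frame_count(sensor_id);
    return s;
}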
S220: synchronously trigger the corresponding sensors according to the frame statistics and the triggering mode.
Further, S220 may include:
determining the output frame rate of each sensor according to the frame statistics;
determining a synchronous frame rate according to the output frame rate of each sensor;
and synchronously triggering the corresponding sensors at the synchronous frame rate and in the configured triggering mode.
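The patent does not state how the synchronous frame rate is derived from the per-sensor output rates; a natural choice is their greatest common divisor, so that every sensor's trigger instants land on a common time grid. A sketch under that assumption:

#include <stdint.h>

static uint32_t gcd_u32(uint32_t a, uint32_t b)
{
    while (b != 0) {
        uint32_t t = a % b;
        a = b;
        b = t;
    }
    return a;
}

/* rates_hz: measured output frame rate of each sensor (from the frame
 * statistics); n > 0. The result is the fastest rate at which every
 * sensor can still be triggered on a common grid. */
uint32_t synchronous_frame_rate(const uint32_t *rates_hz, int n)
{
    uint32_t sync = rates_hz[0];
    for (int i = 1; i < n; ++i)
        sync = gcd_u32(sync, rates_hz[i]);
    return sync;  /* e.g. {480, 120} -> 120 Hz */
}

With the rates from the later binocular example, {480, 120} Hz, this rule yields a 120 Hz synchronous frame rate.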
S230: determine the trigger frame information of the sensors from the frame statistics, and transmit the trigger frame information to the computing platform.
Further, the determining trigger frame information of the sensor from the frame statistics may include:
determining the trigger frames at which the sensors are synchronously triggered according to the synchronous frame rate, and marking those trigger frames as key frames to obtain the trigger frame information.
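One illustrative packing of the trigger frame information sent over the UART is sketched below; the record layout and the uart_send helper are assumptions, not taken from the patent:

#include <stdbool.h>
#include <stdint.h>

extern void uart_send(const uint8_t *buf, uint32_t len);  /* hypothetical UART TX */

/* Illustrative layout of one trigger-frame record sent to the platform. */
typedef struct __attribute__((packed)) {
    uint32_t frame_index;    /* running per-sensor frame counter                */
    uint8_t  sensor_id;
    uint8_t  is_keyframe;    /* 1 if this frame came from a synchronous trigger */
    uint32_t trigger_count;  /* MCU trigger count at capture, for alignment     */
} trigger_frame_info_t;

void send_trigger_frame_info(uint32_t frame_index, uint8_t sensor_id,
                             bool sync_triggered, uint32_t trigger_count)
{
    trigger_frame_info_t info = {
        .frame_index   = frame_index,
        .sensor_id     = sensor_id,
        .is_keyframe   = sync_triggered ? 1u : 0u,
        .trigger_count = trigger_count,
    };
    uart_send((const uint8_t *)&info, sizeof(info));
}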
Next, the application procedure of the time synchronization method of the present application is described with specific examples.
The computing platform initializes the synchronization parameters of the MCU through the UART. The MCU then starts working according to those parameters: it triggers each sensor according to the frame statistics and triggering conditions, marks key-frame and non-key-frame information, and sends that information to the computing platform through the UART, while the sensors send their acquired data to the computing platform when triggered.
The method specifically comprises the following steps:
step 1: the power computing platform carries out initialization parameter configuration to the MCU through the UART, and can specifically comprise an initialization synchronization mode and synchronization parameters, such as configuration of an active/passive trigger mode, configuration of sensor frame interval trigger and configuration of single-condition/multi-condition trigger.
Step 2: the MCU performs sensor frame data statistics according to the initialization parameters; specifically, the MCU counts trigger events through its own interrupts, communicates with the computing platform, and compares the counts against the received data frames, thereby building the sensor frame statistics.
Step 3: the MCU triggers the corresponding sensors in the active/passive triggering mode according to the counted frame data and the triggering conditions.
Step 4: the MCU sends the current trigger frame information to the computing platform through the UART.
Step 5: the sensors collect data according to the synchronous trigger signal generated in step 3 and output the data to the computing platform.
Still further, the embodiments of the present application provide another, more detailed application example for illustration.
When using a visual inertial odometer system, an optimal TD value is usually computed through estimation and optimization before data are synchronized and aligned, but this method produces an error in every estimate due to various sources of uncertainty. Specific working example: take a binocular visual inertial odometer system, whose sensor suite generally consists of two cameras and an IMU, with the interfaces described as follows:
the camera is an image output interface: MIPI-CSI; triggering an interface: GPIO; parameter setting interface: I2C; IMU data interface: I2C; triggering an interface: GPIO. According to the embodiment of the application, an MCU is added to perform frame calculation and trigger management of a specific data link, the trigger GPIO of a camera and the I2C/trigger GPIO of an IMU are respectively connected to the MCU, the power computing platform is respectively connected with the UART of the MCU and the MIPI-CSI of the camera, optionally, the MCU is assumed to be actively triggered by the IMU, the camera is passively triggered, the output frame rate of the IMU is 480hz, and the maximum output frame rate of the camera is 120 hz.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Furthermore, while the application is described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the described functions and/or features may be integrated in a single physical device and/or software module or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Accordingly, one of ordinary skill in the art can implement the application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the application, which is to be defined in the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing an electronic device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the application, the scope of which is defined by the claims and their equivalents.
While the preferred embodiment of the present application has been described in detail, the present application is not limited to the embodiments described above, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present application, and these equivalent modifications or substitutions are included in the scope of the present application as defined in the appended claims.

Claims (10)

1. A visual inertial odometer system for multi-sensor time synchronization, comprising: a computing platform, an MCU, a data bus, a communication interface and two or more sensors;
the computing platform is connected with the MCU through the data bus, the MCU is connected with the sensors through the communication interface, and the computing platform is also connected with the communication interface and the sensors respectively;
the computing platform is used for setting initialization parameters on the MCU, acquiring the sensor data measured by the sensors and performing preset processing;
the MCU is used for controlling the triggering mode of the sensors according to the initialization parameters;
the sensors are used for acquiring the sensor data.
2. The visual inertial odometer system for multi-sensor time synchronization of claim 1, wherein the computing platform adopts an ARM processor or a PC host.
3. The visual inertial odometer system for multi-sensor time synchronization of claim 1, wherein the data bus adopts UART;
the communication interface adopts GPIO.
4. The visual inertial odometer system for multi-sensor time synchronization according to any one of claims 1 to 3, wherein the sensors comprise a camera, a ranging radar, an inertial measurement unit and a positioning sensor.
5. The visual inertial odometer system for multi-sensor time synchronization of claim 4, wherein the camera comprises a monocular camera and a multi-view camera;
the ranging radars include TOF and lidar;
the inertial measurement unit includes gyroscopes, accelerometers, and magnetometers;
and the positioning sensor comprises RTK, GPS and BeiDou navigation positioning sensors.
6. A time synchronization method for a visual inertial odometer system, comprising:
receiving initialization parameters sent by a computing platform, wherein the initialization parameters comprise triggering modes of various sensors;
carrying out sensor frame data statistics according to the initialization parameters to obtain frame statistics data;
synchronously triggering the corresponding sensor according to the frame statistics data and the triggering mode;
and determining trigger frame information of the sensor from the frame statistics data, and transmitting the trigger frame information to the computing platform.
7. The time synchronization method for a visual inertial odometer system of claim 6, wherein the initialization parameters include an active trigger mode and a passive trigger mode of the sensor, a sensor frame-interval trigger parameter, and a single-condition trigger mode and a multi-condition trigger mode of the sensor;
the receiving of the initialization parameters sent by the computing platform comprises:
receiving the active trigger mode and the passive trigger mode of the sensor, the sensor frame-interval trigger parameter, and the single-condition trigger mode and the multi-condition trigger mode of the sensor, which are sent by the computing platform.
8. The time synchronization method for a visual inertial odometer system of claim 6, wherein the performing of sensor frame data statistics according to the initialization parameters to obtain the frame statistics includes:
counting the number of times each sensor has been triggered via MCU interrupt counts, comparing this trigger count against the number of frames received through communication with the computing platform, and compiling sensor frame data statistics to obtain the frame statistics.
9. The time synchronization method for a visual inertial odometer system of claim 6, wherein the synchronously triggering of the corresponding sensors according to the frame statistics and the triggering mode includes:
determining the output frame rate of each sensor according to the frame statistics;
determining a synchronous frame rate according to the output frame rate of each sensor;
and synchronously triggering the corresponding sensors at the synchronous frame rate and in the configured triggering mode.
10. The time synchronization method for a visual inertial odometer system of claim 9, wherein the determining of the trigger frame information of the sensors from the frame statistics includes:
determining the trigger frames at which the sensors are synchronously triggered according to the synchronous frame rate, and marking those trigger frames as key frames to obtain the trigger frame information.
CN202311015704.1A 2023-08-11 2023-08-11 Visual inertial odometer system for multi-sensor time synchronization and time synchronization method thereof Pending CN117191075A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311015704.1A 2023-08-11 2023-08-11 Visual inertial odometer system for multi-sensor time synchronization and time synchronization method thereof

Publications (1)

Publication Number Publication Date
CN117191075A 2023-12-08

Family

ID=88984096


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination