CN111405139B - Time synchronization method, system, visual mileage system and storage medium - Google Patents

Time synchronization method, system, visual mileage system and storage medium

Info

Publication number
CN111405139B
CN111405139B (application CN202010222998.5A)
Authority
CN
China
Prior art keywords
image
imu
image acquisition
time
acquisition module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010222998.5A
Other languages
Chinese (zh)
Other versions
CN111405139A (en)
Inventor
李光耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingke Xiaomei Robot Technology Chengdu Co ltd
Original Assignee
Qingke Xiaomei Robot Technology Chengdu Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingke Xiaomei Robot Technology Chengdu Co ltd filed Critical Qingke Xiaomei Robot Technology Chengdu Co ltd
Priority to CN202010222998.5A priority Critical patent/CN111405139B/en
Publication of CN111405139A publication Critical patent/CN111405139A/en
Application granted granted Critical
Publication of CN111405139B publication Critical patent/CN111405139B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658Clock or time synchronisation among packet nodes
    • H04J3/0661Clock or time synchronisation among packet nodes using timestamps

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electric Clocks (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a time synchronization method, a system, a visual odometry system and a storage medium, belonging to the technical field of machine vision. In the method, if the interrupt period of the inertial measurement module is T, the trigger period of the image is set to nT; the inertial measurement module sends a trigger signal to the image acquisition module at time m·nT, and after receiving the trigger signal the image acquisition module controls the imaging time of the image so that the imaging moment of the image is (m+1)·nT, where m and n are integers greater than or equal to 1. The method flexibly controls the start and end of image exposure and further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the accuracy and stability of the visual odometer system to the greatest extent.

Description

Time synchronization method, system, visual mileage system and storage medium
Technical Field
The invention relates to the technical field of machine vision, and in particular to a time synchronization method, a system, a visual odometry system and a storage medium.
Background
Machine vision is one of the most advanced technologies at present. One of its branches, visual-inertial odometry (VIO), is widely applied in VR/AR, autonomous driving and mobile robots; for example, Apple ARKit and Google ARCore both use VIO technology for spatial positioning.
In general, a VIO system ignores the time deviation between the IMU (inertial measurement unit) and the camera and treats the two as synchronized and aligned. However, owing to trigger delay, transmission delay, inaccurate synchronization clocks and other problems in the hardware system, a time deviation usually exists between the IMU and the camera, and eliminating or correcting this deviation effectively improves the performance of the VIO system.
Visual odometry works by computing the position change of image feature points between two adjacent frames to obtain the velocity and displacement between those frames, while the IMU data between the two frames is integrated to obtain the motion velocity and displacement trajectory; the two results are then fused by an algorithm to yield a more accurate velocity and trajectory. However, most cameras cannot align the image timestamp with the IMU timestamp, and only post-hoc interpolation of the IMU data can be used to align it, which introduces errors.
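To make the cost of such misalignment concrete, the short sketch below (purely illustrative, with made-up numbers; the variable names and the constant-acceleration model are assumptions, not taken from the patent) integrates IMU samples over one inter-frame interval and shows how a timestamp offset between the image and the IMU window biases the integrated displacement.

```c
/* Illustrative only: how a camera/IMU timestamp offset biases the
 * displacement integrated from IMU samples between two image frames.
 * Numbers are made up and are not taken from the patent. */
#include <stdio.h>

int main(void) {
    const double T = 1.0 / 200.0;  /* IMU sample period (200 Hz)          */
    const int    n = 4;            /* image period = n*T  -> 50 Hz frames */
    const double a = 1.0;          /* constant acceleration, m/s^2        */
    double v = 0.0, s = 0.0;

    /* Integrate exactly one inter-frame interval [0, n*T]. */
    for (int k = 0; k < n; ++k) { v += a * T; s += v * T; }

    /* If the image timestamp is late by half an IMU period, the window the
     * IMU data is matched to is shifted, biasing the fused estimate. */
    const double offset   = T / 2.0;
    const double s_biased = s + v * offset;  /* first-order effect */

    printf("displacement over one frame interval : %.6f m\n", s);
    printf("same interval seen with %.1f ms offset: %.6f m\n",
           offset * 1e3, s_biased);
    return 0;
}
```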
In some existing cameras, the image pipeline and the IMU run as two independent systems, and timestamps are only applied by the host computer after the data are received; such timestamps are extremely inaccurate and introduce large transmission-delay and stamping errors.
More advanced monocular and binocular cameras such as the RealSense and ZED do mark the image timestamp and the IMU timestamp under the same time system, but the offset between the two still cannot be completely aligned or eliminated, and it is extremely unstable under different exposure times, which introduces many uncontrollable errors into the visual odometry system and brings uncertainty to the long-term stable operation of the whole system.
Disclosure of Invention
The invention aims to provide a time synchronization method with the characteristics of accurate positioning and a stable system.
The first object of the present invention is achieved by the following technical solutions:
A method of time synchronization, comprising:
if the interrupt period of the inertial measurement module is T, the trigger period of the image is set to nT;
the inertial measurement module sends a trigger signal to the image acquisition module at time m·nT; after receiving the trigger signal, the image acquisition module controls the imaging time of the image so that the imaging moment of the image is (m+1)·nT,
wherein m and n are integers greater than or equal to 1.
The method flexibly controls the start and end of image exposure and further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the accuracy and stability of the visual odometer system to the greatest extent.
The present invention may be further configured in a preferred example to: set the image acquisition module to a triggered synchronization mode that takes the middle moment of image exposure as the synchronization point.
The present invention may be further configured in a preferred example to: transmit the raw data of the inertial measurement module, the timestamp of the inertial measurement module, the image and the timestamp of the image at time (m+1)·nT to a host computer.
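As a minimal sketch of the timing rule above (assuming the exposure duration is known in advance; the structure and function names are illustrative, not the patented firmware), the trigger is issued at m·nT and the exposure window is placed so that its midpoint falls exactly on (m+1)·nT:

```c
/* Sketch of the timing rule above; names are illustrative assumptions. */
#include <stdio.h>

typedef struct {
    double trigger_time;    /* IMU-side trigger fires here: m*n*T          */
    double exposure_start;  /* chosen so the exposure midpoint is exactly  */
    double exposure_end;    /*   (m+1)*n*T                                 */
} frame_schedule_t;

static frame_schedule_t schedule_frame(int m, int n, double T, double exposure) {
    frame_schedule_t f;
    const double mid = (m + 1) * n * T;     /* target synchronization point */
    f.trigger_time   = (double)m * n * T;
    f.exposure_start = mid - exposure / 2.0;
    f.exposure_end   = mid + exposure / 2.0;
    return f;
}

int main(void) {
    /* Example: 200 Hz IMU (T = 5 ms), n = 4 (50 Hz frames), 8 ms exposure. */
    frame_schedule_t f = schedule_frame(1, 4, 1.0 / 200.0, 0.008);
    printf("trigger %.3f s, exposure %.3f..%.3f s, midpoint %.3f s\n",
           f.trigger_time, f.exposure_start, f.exposure_end,
           (f.exposure_start + f.exposure_end) / 2.0);
    return 0;
}
```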
The invention further aims to provide a system for real-time synchronization and alignment of the image and the IMU timestamp, which has the characteristic of accurate timing.
The second object of the present invention is achieved by the following technical solutions:
A time synchronization system comprising an inertial measurement module, an image acquisition module and a microprocessor module for implementing any of the methods described above, wherein
the data interrupt period generated by the inertial measurement module is T;
the trigger period of the image acquisition module is nT;
the microprocessor module is communicatively connected with the inertial measurement module and the image acquisition module respectively; and
the inertial measurement module sends a trigger signal to the image acquisition module through the microprocessor module at time m·nT, and after receiving the trigger signal the image acquisition module controls the imaging time of the image so that the imaging moment of the image is (m+1)·nT.
The system has a simple structure; by flexibly controlling the start and end of image exposure it further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the accuracy and stability of the visual odometer system to the greatest extent.
The present invention may be further configured in a preferred example to: the image acquisition module comprises an image acquisition unit and an ISP chip; the ISP chip is set to a trigger mode, and the ISP chip is set to a triggered synchronization mode that takes the middle of the image exposure of the image acquisition unit as the synchronization point.
The present invention may be further configured in a preferred example to: the image acquisition unit comprises at least one camera.
The present invention may be further configured in a preferred example to: the microprocessor module comprises at least one MCU which is respectively in communication connection with the inertial measurement module and the image acquisition module.
The present invention may be further configured in a preferred example to: the microprocessor module comprises a first MCU and a second MCU, wherein the first MCU is communicatively connected with the inertial measurement module and the image acquisition module, and is used for acquiring the image timestamp, the IMU timestamp and the IMU raw data and for transmitting the trigger signal;
the second MCU is communicatively connected with the image acquisition module and the first MCU, and is used for acquiring the image and receiving the image timestamp, the IMU timestamp and the IMU raw data from the first MCU.
The invention further aims to provide a visual odometer system with the characteristic of a stable system.
The third object of the present invention is achieved by the following technical solutions:
A visual odometry system comprising a memory and a processor, the memory having stored thereon a computer program that can be loaded by the processor to perform the above method of synchronizing and aligning the image with the IMU timestamp in real time.
The fourth object of the present invention is to provide a computer storage medium capable of storing a corresponding program, which has the characteristic of facilitating real-time synchronization of an image and an IMU time stamp.
The fourth object of the present invention is achieved by the following technical solutions:
A computer readable storage medium storing a computer program that can be loaded by a processor to perform any of the above methods of synchronizing and aligning the image with the IMU timestamp in real time.
In summary, the present invention includes at least the following beneficial technical effect: by flexibly controlling the start and end of image exposure, the invention further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the precision and stability of the visual odometer system to the greatest extent.
Drawings
Fig. 1 is a system block diagram of a time synchronization system in accordance with one embodiment of the present invention.
Fig. 2 is a flowchart of a time synchronization method according to an embodiment of the invention.
FIG. 3 is a timing diagram of an image and IMU in accordance with one embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
This embodiment is only intended to explain the present invention and is not to be construed as limiting it; after reading this specification, a person skilled in the art may make modifications to the embodiment that involve no creative contribution, and such modifications are protected by patent law within the scope of the claims of the present invention.
In addition, the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. Unless otherwise specified herein, the character "/" generally indicates an "or" relationship between the associated objects.
Example 1: the embodiment of the invention provides a time synchronization system, which is shown in fig. 1 and comprises an inertial measurement module, an image acquisition module and a microprocessor module.
The inertial measurement module comprises an IMU chip, and the IMU chip can adopt chips such as ICM20602, BMI088 and the like and is used for three-axis attitude angles (or angular rates) and accelerations of an object.
The image acquisition module comprises an image acquisition unit and an ISP chip, wherein the image acquisition unit can adopt one camera or more than two cameras, and in the embodiment, two cameras are preferably adopted. The camera chip can adopt AR0144, AR0135 and other chips for acquiring images in real time, and can adopt cameras with higher frame rate if the requirements on the images are higher.
The ISP chip can adopt an AP1302 and the like, is in communication connection with the two cameras and is used for controlling the starting and ending time of the exposure of the two cameras and determining the middle time of the image.
The microprocessor module comprises a first MCU and a second MCU, the first MCU preferably adopts a common MCU, chips such as STM32, GD32 and the like can be adopted, and the first MCU is respectively in communication connection with the ISP chip and the IMU chip. The second MCU adopts an image MCU, the image MCU can adopt CYPRESS CY3014 or CY3065, and the second MCU is respectively in communication connection with the ISP chip and the first MCU.
The second MCU gathers IMU original data, IMU time stamp, image and image time stamp and sends to the host computer with vision mileage calculation method through USB to calculate.
In the embodiment of the invention, i2c, uart, spi and the like can be used for communication between chips. The first MCU may be eliminated if a higher image frame rate is not required and the second MCU may be used to replace the operation of the first MCU. The data transmission mode between the second MCU and the host can also be replaced by a network port by USB.
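One way the aggregated data stream could be laid out is sketched below; the field names, sizes and the fixed-size sample array are assumptions made for illustration only and are not the USB protocol of the described device.

```c
/* Hypothetical layout of the aggregated packet the second MCU could send to
 * the host over USB; field names, sizes and counts are assumptions made for
 * illustration, not the device's actual protocol. */
#include <stdint.h>
#include <stdio.h>

#pragma pack(push, 1)
typedef struct {
    uint64_t imu_timestamp_us;   /* time of the IMU data-ready interrupt    */
    int16_t  gyro[3];            /* raw gyroscope sample (x, y, z)          */
    int16_t  accel[3];           /* raw accelerometer sample (x, y, z)      */
} imu_sample_t;

typedef struct {
    uint64_t     image_timestamp_us;  /* = IMU interrupt time at (m+1)*n*T  */
    uint16_t     imu_sample_count;    /* IMU samples collected this frame   */
    imu_sample_t samples[4];          /* e.g. n = 4 samples per image frame */
    /* the image pixels would follow in a separate bulk transfer */
} sync_packet_t;
#pragma pack(pop)

int main(void) {
    printf("IMU sample: %zu bytes, packet header: %zu bytes\n",
           sizeof(imu_sample_t), sizeof(sync_packet_t));
    return 0;
}
```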
Example 2: the embodiment of the invention provides a time synchronization method, which is mainly based on the time synchronization system, and the main flow of the method is described as follows.
As shown in fig. 2:
s100, initializing an IMU chip by the first MCU, setting the IMU chip to be in an interrupt mode, and interrupting once every time a group of data is generated. And setting the interrupt period of the IMU chip as T, wherein the image trigger period is nT, and n is an integer. Wherein n is an integer greater than or equal to 1. If IMU frequency 200HZ, n=4, then t=1/200, nt=1/50, and the image frame rate is 50HZ.
S200, the first MCU initializes the ISP chip, sets it to trigger mode, and sets the trigger synchronization mode of the ISP chip to take the middle moment of image exposure as the synchronization point.
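A hedged sketch of the initialization in S100 and S200 is given below; the register addresses, register values and bus helper functions are hypothetical placeholders rather than the actual ICM20602/AP1302 programming sequence.

```c
/* Sketch of S100/S200.  The register addresses, values and bus helpers are
 * hypothetical placeholders, not the real ICM20602/AP1302 register maps. */
#include <stdint.h>

#define IMU_RATE_HZ  200u   /* interrupt period T = 1/200 s          */
#define FRAME_DIV_N  4u     /* image trigger period nT -> 50 fps     */

/* Hypothetical low-level bus helpers (stubs for illustration). */
static void imu_write_reg(uint8_t reg, uint8_t val)   { (void)reg; (void)val; }
static void isp_write_reg(uint16_t reg, uint16_t val) { (void)reg; (void)val; }

void sync_system_init(void) {
    /* S100: IMU in data-ready interrupt mode, one interrupt per sample,
     * output data rate 200 Hz (interrupt period T). */
    imu_write_reg(0x01 /* hypothetical ODR register */, 0x07);
    imu_write_reg(0x02 /* hypothetical INT enable   */, 0x01);

    /* S200: ISP in external-trigger mode, synchronization point set to the
     * middle moment of the exposure. */
    isp_write_reg(0x0100 /* hypothetical trigger-mode reg */, 1u);
    isp_write_reg(0x0101 /* hypothetical sync-point reg   */, 1u);
}
```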
S300, the IMU chip periodically generates data interrupts. After receiving an IMU data interrupt signal, the first MCU reads the IMU data, and after the IMU data interrupt at time m·nT it sends a GPIO trigger signal to the ISP chip, where m is an integer greater than or equal to 1.
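The interrupt-service logic of S300 could look roughly like the following sketch, where every n-th IMU data-ready interrupt (i.e. the one at m·nT) pulses the GPIO line toward the ISP; all hardware helpers are stubs and the names are assumptions, not the device firmware.

```c
/* Sketch of S300: every IMU data-ready interrupt reads the sample, and
 * every n-th interrupt (at times m*n*T) pulses the GPIO trigger to the ISP.
 * Hardware helpers are stubs; all names are illustrative assumptions. */
#include <stdint.h>

#define FRAME_DIV_N 4u   /* n: one image trigger every n IMU interrupts */

static void imu_read_sample(int16_t g[3], int16_t a[3]) { (void)g; (void)a; }
static void gpio_pulse_isp_trigger(void)                {}
static uint64_t local_time_us(void)                     { return 0u; }

static uint32_t imu_irq_count;
static uint64_t last_trigger_time_us;   /* time of the m*n*T interrupt */

void imu_data_ready_isr(void) {
    int16_t gyro[3], accel[3];
    imu_read_sample(gyro, accel);               /* read the raw IMU data */

    if (++imu_irq_count % FRAME_DIV_N == 0u) {
        /* This interrupt fell on m*n*T: trigger the next exposure so that
         * its midpoint lands on the (m+1)*n*T interrupt. */
        last_trigger_time_us = local_time_us();
        gpio_pulse_isp_trigger();
    }
}
```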
S400, the first MCU sends the GPIO trigger signal to the ISP chip; after receiving the trigger signal, the ISP chip immediately controls the next image exposure and automatically calculates the start and end times of the exposure, so that the middle moment of the image exposure corresponds to the IMU chip interrupt at time (m+1)·nT. The first MCU takes the time at which the IMU triggered the interrupt at (m+1)·nT as the timestamp common to the image and the IMU at that moment.
S500, the ISP chip starts the exposure under the control of the first MCU and obtains the middle moment of the image exposure. At that middle moment the first MCU receives the data interrupt at time (m+1)·nT, which corresponds to the middle moment of the exposure, and the time at which the IMU chip triggers this interrupt is taken as the common timestamp of the exposed image and the IMU at that moment.
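A minimal sketch of the timestamp assignment in S400/S500 is shown below: the IMU interrupt that falls at (m+1)·nT coincides with the middle of the exposure, so its time is written into both the IMU record and the image record. The types and function names are illustrative assumptions, not the patent's firmware.

```c
/* Sketch of the common-timestamp assignment in S400/S500.  The IMU
 * interrupt at (m+1)*n*T coincides with the middle of the exposure, so its
 * time stamps both the IMU sample and the image.  Names are illustrative. */
#include <stdint.h>
#include <stdbool.h>

typedef struct { uint64_t t_us; int16_t gyro[3], accel[3]; } imu_record_t;
typedef struct { uint64_t t_us; uint32_t frame_id; }         image_record_t;

static bool     exposure_in_flight;   /* set when the GPIO trigger was sent */
static uint32_t next_frame_id;

/* Called from the IMU ISR on the interrupt that falls at (m+1)*n*T. */
void on_mid_exposure_interrupt(uint64_t irq_time_us,
                               imu_record_t *imu, image_record_t *img) {
    imu->t_us = irq_time_us;            /* timestamp the IMU sample          */
    if (exposure_in_flight) {
        img->t_us     = irq_time_us;    /* identical timestamp for the image */
        img->frame_id = next_frame_id++;
        exposure_in_flight = false;     /* pixel data arrives afterwards     */
    }
}
```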
S600, before the exposure is finished, the first MCU sends the IMU raw data, the IMU timestamp of the previous period and the image timestamp to the second MCU.
S700, the second MCU starts to receive the image at time (m+1)·nT.
S800, the second MCU transmits the received IMU timestamps, image timestamps and IMU data to a host computer running the visual odometry algorithm, via USB or a network port, for computation.
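On the host side, the benefit is that pairing needs no interpolation: because the image timestamp equals the (m+1)·nT IMU timestamp by construction, an exact-match lookup suffices. The sketch below is illustrative; the message types are assumptions, not the patent's host software.

```c
/* Host-side sketch (illustrative message types, not the patent's software):
 * the image timestamp equals one IMU timestamp by construction, so pairing
 * is an exact-match lookup rather than interpolation. */
#include <stdint.h>
#include <stddef.h>

typedef struct { uint64_t t_us; double gyro[3], accel[3]; } imu_msg_t;
typedef struct { uint64_t t_us; const uint8_t *pixels; }    image_msg_t;

/* Returns the index of the IMU sample sharing the image timestamp,
 * or -1 if that sample has not arrived yet. */
int find_synchronous_imu(const image_msg_t *img,
                         const imu_msg_t *imu, size_t count) {
    for (size_t i = 0; i < count; ++i)
        if (imu[i].t_us == img->t_us)   /* exact match, no interpolation */
            return (int)i;
    return -1;
}
```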
In fig. 3, after system initialization is complete, the IMU chip sends two complete nT-period trigger signals to the ISP chip; these first two nT periods do not trigger image acquisition, and image acquisition starts in the third nT period.
Example 3: a visual odometer system comprising a memory and a processor, the memory having stored thereon a computer program capable of being loaded by the processor and performing the method of embodiment 2.
Example 4: a computer readable storage medium storing a computer program capable of being loaded by a processor and executing the method of embodiment 2.

Claims (9)

1. A method of time synchronization, comprising:
if the interrupt period of the inertial measurement module is T, the trigger period of the image is set to nT;
the inertial measurement module sends a trigger signal to the image acquisition module at time m·nT; after receiving the trigger signal, the image acquisition module controls the imaging time of the image so that the imaging moment of the image is (m+1)·nT,
wherein m and n are integers greater than or equal to 1;
the inertial measurement module comprises an IMU chip; the IMU chip is set to interrupt mode and generates one interrupt each time a group of data is produced;
the image acquisition module comprises an ISP chip and a camera, wherein the ISP chip controls the start and end times of the exposure of the camera and determines the middle moment of the image;
after receiving the data interrupt signal of the IMU chip at time m·nT, a trigger signal is sent to the ISP chip;
the image acquisition module is set to a triggered synchronization mode that takes the middle moment of image exposure as the synchronization point;
the middle moment of the image exposure corresponds to the IMU chip interrupt at time (m+1)·nT, and the time at which the IMU triggers the interrupt at (m+1)·nT is taken as the common timestamp of the image and the IMU at that moment.
2. The time synchronization method according to claim 1, wherein the raw data of the inertial measurement module, the timestamp of the inertial measurement module, the image, and the timestamp of the image at time (m+1)·nT are transmitted to a host computer.
3. A time synchronization system comprising an inertial measurement module, an image acquisition module and a microprocessor module for implementing the method of any one of claims 1-2, wherein
the data interrupt period generated by the inertial measurement module is T;
the trigger period of the image acquisition module is nT;
the microprocessor module is communicatively connected with the inertial measurement module and the image acquisition module respectively; and
the inertial measurement module sends a trigger signal to the image acquisition module at time m·nT, and after receiving the trigger signal the image acquisition module controls the imaging time of the image so that the imaging moment of the image is (m+1)·nT.
4. The time synchronization system according to claim 3, wherein the image acquisition module comprises an image acquisition unit and an ISP chip; the ISP chip is set to a trigger mode, and the ISP chip is set to a triggered synchronization mode that takes the middle of the image exposure of the image acquisition unit as the synchronization point.
5. The system of claim 4, wherein the image acquisition unit comprises at least one camera.
6. A time synchronization system according to claim 3, wherein the microprocessor module comprises at least one MCU communicatively connected to the inertial measurement module and the image acquisition module, respectively.
7. The time synchronization system according to claim 3, wherein the microprocessor module comprises a first MCU and a second MCU; the first MCU is communicatively connected with the inertial measurement module and the image acquisition module, and is configured to acquire an image timestamp, an IMU timestamp and IMU raw data and to transmit a trigger signal;
the second MCU is communicatively connected with the image acquisition module and the first MCU, and is configured to acquire an image and to receive the image timestamp, the IMU timestamp and the IMU raw data from the first MCU.
8. A visual odometer system, comprising a memory and a processor, the memory having stored thereon a computer program capable of being loaded by the processor and performing the method according to any of claims 1 to 2.
9. A computer readable storage medium, characterized in that a computer program is stored which can be loaded by a processor and which performs the method according to any of claims 1 to 2.
CN202010222998.5A 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium Active CN111405139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010222998.5A CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010222998.5A CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Publications (2)

Publication Number Publication Date
CN111405139A CN111405139A (en) 2020-07-10
CN111405139B true CN111405139B (en) 2023-10-17

Family

ID=71432929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010222998.5A Active CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Country Status (1)

Country Link
CN (1) CN111405139B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113225481B (en) * 2021-04-30 2023-06-16 深圳市塞防科技有限公司 Camera image exposure time acquisition system, vehicle and method
CN114007060A (en) * 2021-09-30 2022-02-01 青岛歌尔声学科技有限公司 Image data and IMU data processing system, method, medium, and head-mounted device
CN116381468B (en) * 2023-06-05 2023-08-22 浙江瑞测科技有限公司 Method and device for supporting multi-chip parallel test by single image acquisition card

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105806334A (en) * 2016-03-07 2016-07-27 苏州中德睿博智能科技有限公司 Inertial sensor and vision sensor data synchronous acquisition system
CN106027909A (en) * 2016-07-05 2016-10-12 大连海事大学 System and method for synchronously collecting shipboard videos based on MEMS inertial sensor and camera
CN108645402A (en) * 2018-03-30 2018-10-12 深圳清创新科技有限公司 Camera shooting and inertia measurement sensing device, scene cut and pose computing system
CN109922260A (en) * 2019-03-04 2019-06-21 中国科学院上海微系统与信息技术研究所 The method of data synchronization and synchronizing device of image and inertial sensor
KR20190108307A (en) * 2018-03-14 2019-09-24 국방과학연구소 Method and apparatus for providing navigation data in inertial navigation system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10841496B2 (en) * 2017-10-19 2020-11-17 DeepMap Inc. Lidar to camera calibration based on edge detection

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105806334A (en) * 2016-03-07 2016-07-27 苏州中德睿博智能科技有限公司 Inertial sensor and vision sensor data synchronous acquisition system
CN106027909A (en) * 2016-07-05 2016-10-12 大连海事大学 System and method for synchronously collecting shipboard videos based on MEMS inertial sensor and camera
KR20190108307A (en) * 2018-03-14 2019-09-24 국방과학연구소 Method and apparatus for providing navigation data in inertial navigation system
CN108645402A (en) * 2018-03-30 2018-10-12 深圳清创新科技有限公司 Camera shooting and inertia measurement sensing device, scene cut and pose computing system
CN109922260A (en) * 2019-03-04 2019-06-21 中国科学院上海微系统与信息技术研究所 The method of data synchronization and synchronizing device of image and inertial sensor

Also Published As

Publication number Publication date
CN111405139A (en) 2020-07-10

Similar Documents

Publication Publication Date Title
CN111405139B (en) Time synchronization method, system, visual mileage system and storage medium
CN109104259B (en) Multi-sensor time synchronization system and method
CN109922260B (en) Data synchronization method and synchronization device for image sensor and inertial sensor
WO2021031604A1 (en) Method and device for hardware time synchronization between multi-channel imus and cameras of bionic eye
CN109729277B (en) Multi-sensor acquisition timestamp synchronization device
CN111860604B (en) Data fusion method, system and computer storage medium
EP4020855A1 (en) Time synchronization method and apparatus
WO2018228352A1 (en) Synchronous exposure method and apparatus and terminal device
WO2018228353A1 (en) Control method and apparatus for synchronous exposure of multi-camera system, and terminal device
CN104764442A (en) Method and device for determining exposure time of aerial photogrammetric camera in light-small unmanned aerial vehicle
CN110740227B (en) Camera time synchronization device and method based on GNSS time service and image display information coding mode
CN110139066A (en) A kind of Transmission system of sensing data, method and apparatus
CN111600670B (en) Inductive data calculation control method and time service device
CN113496545A (en) Data processing system, method, sensor, mobile acquisition backpack and equipment
US11819755B2 (en) VR system and positioning and tracking method of VR system
CN104748730A (en) Device and method for determining exposure moment of aerial survey camera in unmanned aerial vehicle
CN110139041B (en) Remote multi-sensing signal synchronous acquisition method
WO2018227329A1 (en) Synchronous exposure method and device, and terminal device
US11195297B2 (en) Method and system for visual localization based on dual dome cameras
CN107314770A (en) A kind of mobile robot and its master controller, alignment system and method
KR101402088B1 (en) Method for estimating location based on technology convergence of satellite navigation system and a vision system
CN111200698B (en) Remote multi-sensor multi-channel receiving method
WO2019080879A1 (en) Data processing method, computer device, and storage medium
WO2023165569A1 (en) Multi-sensor simultaneous positioning method and apparatus, system, and storage medium
CN112995524A (en) High-precision acquisition vehicle, and photo exposure information generation system, method and synchronization device thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200918

Address after: 610000 9, 3 building 200, Tianfu five street, hi tech Zone, Chengdu, Sichuan.

Applicant after: Qingke Xiaomei robot technology (Chengdu) Co.,Ltd.

Address before: 214000 13 / F, east side, building A1, No. 777, Jianshe West Road, Binhu District, Wuxi City, Jiangsu Province

Applicant before: SLIGHTECH INTELLIGENT SCIENCE & TECHNOLOGY (JIANGSU) Co.,Ltd.

GR01 Patent grant
GR01 Patent grant