CN111405139A - Time synchronization method, system, visual mileage system and storage medium - Google Patents

Time synchronization method, system, visual mileage system and storage medium

Info

Publication number
CN111405139A
CN111405139A (application CN202010222998.5A)
Authority
CN
China
Prior art keywords
image
time
image acquisition
module
mcu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010222998.5A
Other languages
Chinese (zh)
Other versions
CN111405139B (en)
Inventor
李光耀 (Li Guangyao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingke Xiaomei Robot Technology Chengdu Co ltd
Original Assignee
Slightech Intelligent Science & Technology Jiangsu Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Slightech Intelligent Science & Technology Jiangsu Co ltd
Priority to CN202010222998.5A
Publication of CN111405139A
Application granted
Publication of CN111405139B
Legal status: Active (granted)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658Clock or time synchronisation among packet nodes
    • H04J3/0661Clock or time synchronisation among packet nodes using timestamps

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electric Clocks (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to a time synchronization method, a time synchronization system, a visual odometry system and a storage medium, and belongs to the technical field of machine vision. The method comprises: setting the interrupt period of an inertial measurement module to T and the trigger period of the image to nT; the inertial measurement module at time m×nT sends a trigger signal to the image acquisition module, and the image acquisition module, after receiving the trigger signal, controls the image imaging time so that the imaging moment of the image is (m+1)×nT, where m and n are integers greater than or equal to 1. The method solves the problem of synchronizing the IMU and image timestamps in a visual odometry system: by flexibly controlling the start and end of image exposure it further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the precision and stability of the visual odometry system to the greatest extent.

Description

Time synchronization method, system, visual mileage system and storage medium
Technical Field
The invention relates to the technical field of machine vision, and in particular to a time synchronization method, a time synchronization system, a visual odometry system and a storage medium.
Background
Machine vision is one of the leading technologies at present. One of its key techniques is visual-inertial odometry (VIO), which, as a spatial positioning method, is widely applied in VR/AR, autonomous driving and mobile robots; for example, the popular Apple ARKit and Google ARCore have used the VIO technique for spatial positioning in recent years.
In general, a VIO system ignores the time deviation between the IMU (inertial measurement unit) and the camera and treats the two as synchronized and aligned. However, because of trigger delay, transmission delay, inaccurate clock synchronization and other problems of the hardware system, a time deviation between the IMU and the camera usually exists, and eliminating and correcting this deviation can effectively improve the performance of the VIO system.
The principle of a visual odometry algorithm is to compute the change in the positions of image feature points between two adjacent frames and thereby obtain the velocity and displacement between the two frames; the IMU data between the two frames are then integrated to obtain the motion velocity and displacement trajectory, and the two results are fused algorithmically to obtain a more accurate velocity and trajectory.
In many existing products on the market, the camera and the IMU are two separately and independently running systems: timestamps are attached to the data only after the host computer receives the images and the IMU data. Such timestamps are extremely inaccurate and introduce excessive transmission-delay errors and stamping errors.
Even when image timestamps and IMU timestamps are marked under the same time system, they cannot be completely aligned, and the deviation between them under different exposure times is extremely unstable. This introduces many uncontrollable errors into a visual odometry system and brings uncertainty to the long-term stable operation of the whole system.
Disclosure of Invention
The invention aims to provide a time synchronization method which has the characteristics of accurate positioning and stable system.
The above object of the present invention is achieved by the following technical solutions:
a method of time synchronization, comprising:
setting the interrupt period of the inertial measurement module to T and the trigger period of the image to nT;
the inertial measurement module at time m×nT sends a trigger signal to the image acquisition module, and the image acquisition module, after receiving the trigger signal, controls the image imaging time so that the imaging moment of the image is (m+1)×nT,
wherein m and n are integers greater than or equal to 1.
The method flexibly controls the start and end of image exposure, further eliminates the deviation between the image timestamp and the IMU timestamp, and improves the precision and stability of the visual odometry system to the greatest extent.
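Expressed as a worked timing relation (a sketch only; E denotes the exposure duration, a symbol introduced here purely for illustration):

    t_{\mathrm{trig}} = m\,nT, \qquad t_{\mathrm{img}} = (m+1)\,nT,
    t_{\mathrm{start}} = (m+1)\,nT - \tfrac{E}{2}, \qquad t_{\mathrm{end}} = (m+1)\,nT + \tfrac{E}{2}.

With T = 1/200 s and n = 4 (the values used in the embodiment below), the trigger for m = 1 is sent at 20 ms and the corresponding mid-exposure moment, which also serves as the shared timestamp, falls at 40 ms.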
The present invention in a preferred example may be further configured to: set the image acquisition module to a triggered synchronization mode and take the middle moment of image exposure as the synchronization point.
The present invention in a preferred example may be further configured to: send the raw data of the inertial measurement module at time (n+1)T, the timestamp of the inertial measurement module, the image and the image timestamp to the host.
Another aim of the invention is to provide a system for real-time synchronization and alignment of images and IMU timestamps, which has the characteristic of accurate timing.
The second aim of the invention is realized by the following technical scheme:
a time synchronization system comprises an inertial measurement module, an image acquisition module and a microprocessor module for implementing any one of the methods,
the data interrupt period generated by the inertial measurement module is T;
the trigger period of the image acquisition module is nT;
the microprocessor module is respectively in communication connection with the inertial measurement module and the image acquisition module;
and the inertial measurement module at time m×nT sends a trigger signal, via the microprocessor module, to the image acquisition module, and the image acquisition module, after receiving the trigger signal, controls the image imaging time so that the imaging moment of the image is (m+1)×nT.
The system is simple in structure; by flexibly controlling the start and end of image exposure it further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the precision and stability of the visual odometry system to the greatest extent.
The present invention in a preferred example may be further configured to: the image acquisition module comprises an image acquisition unit and an ISP chip, the ISP chip is set to a trigger mode, and the triggered synchronization mode of the ISP chip is set to take the middle moment of image exposure of the image acquisition unit as the synchronization point.
The present invention in a preferred example may be further configured to: the image acquisition unit comprises at least one camera.
The present invention in a preferred example may be further configured to: the microprocessor module comprises at least one MCU, and the MCU is respectively in communication connection with the inertia measurement module and the image acquisition module.
The present invention in a preferred example may be further configured to: the microprocessor module comprises a first MCU and a second MCU; the first MCU is in communication connection with the inertial measurement module and the image acquisition module and is used for acquiring the image timestamp, the IMU timestamp and the IMU raw data and for sending the trigger signal;
the second MCU is in communication connection with the image acquisition module and the first MCU and is used for acquiring images and for receiving the image timestamp, the IMU timestamp and the IMU raw data from the first MCU.
A third aim of the invention is to provide a visual odometry system which has the characteristic of stable operation.
The third object of the invention is realized by the following technical scheme:
a visual odometry system comprising a memory and a processor, the memory having stored thereon a computer program that is loadable by the processor and that performs the method of real-time synchronization and alignment of the above-mentioned images with IMU timestamps.
The fourth object of the present invention is to provide a computer storage medium capable of storing a corresponding program and having a feature of facilitating real-time synchronization of an image and an IMU timestamp.
The fourth object of the invention is realized by the following technical scheme:
a computer readable storage medium storing a computer program capable of being loaded by a processor and executing any of the above methods for real-time synchronization and alignment of an image with an IMU timestamp.
In summary, the invention includes at least one of the following beneficial technical effects: by flexibly controlling the start and end of image exposure, the invention further eliminates the deviation between the image timestamp and the IMU timestamp, thereby improving the precision and stability of the visual odometry system to the greatest extent.
Drawings
Fig. 1 is a system block diagram of a time synchronization system according to an embodiment of the present invention.
Fig. 2 is a flow chart of a time synchronization method according to an embodiment of the invention.
FIG. 3 is a timing diagram of an image and IMU according to one embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
The present embodiment is merely illustrative of the invention and does not limit it; after reading this specification, those skilled in the art can modify the embodiment as needed without making an inventive contribution, and all such modifications are protected by patent law within the scope of the claims of the invention.
In addition, the term "and/or" herein merely describes an association between related objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates that the related objects before and after it are in an "or" relationship, unless otherwise specified.
Example 1: an embodiment of the present invention provides a time synchronization system, as shown in fig. 1, including an inertial measurement module, an image acquisition module, and a microprocessor module.
The inertial measurement module comprises an IMU chip, which may use a chip such as the ICM20602 or BMI088 and is used for measuring the three-axis attitude angle (or angular rate) and the acceleration of an object.
The image acquisition module comprises an image acquisition unit and an ISP chip. The image acquisition unit may use one camera or two or more cameras; two cameras are preferably used in this embodiment. The camera chip may use a chip such as the AR0144 or AR0135 for acquiring images in real time; if the image requirements are high, a camera with a higher frame rate may be used.
The ISP chip may use a chip such as the AP1302; it is in communication connection with the two cameras and is used for controlling the exposure start time and the exposure end time of the two cameras and for determining the middle moment of the images.
The microprocessor module comprises a first MCU and a second MCU. The first MCU is preferably a general-purpose MCU, for example an STM32 or GD32 chip, and is in communication connection with the ISP chip and the IMU chip respectively. The second MCU is an image MCU, for example a Cypress CY3014 or CY3065, and is in communication connection with the ISP chip and the first MCU respectively.
The second MCU collects the IMU raw data, the IMU timestamp, the image and the image timestamp and sends them over USB to a host running a visual odometry algorithm for computation.
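A minimal sketch of the kind of records such a data path could carry is shown below; the struct names, field types and layout are assumptions introduced here for illustration only and are not specified by the invention.

    #include <stdint.h>

    /* Hypothetical record layout (illustrative only, not from the patent). */
    typedef struct {
        uint64_t timestamp_us;   /* IMU timestamp: time of the IMU data-ready interrupt */
        int16_t  gyro[3];        /* raw angular-rate readings */
        int16_t  accel[3];       /* raw acceleration readings */
    } imu_record_t;

    typedef struct {
        uint64_t timestamp_us;   /* image timestamp: equals the IMU interrupt time at (m+1)*n*T */
        uint16_t width;          /* image width in pixels */
        uint16_t height;         /* image height in pixels */
        /* image payload follows in the USB transfer */
    } image_header_t;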
In the embodiment of the invention, the communication among the chips may use I2C, UART, SPI and the like. If a high image frame rate is not required, the first MCU can be omitted and the second MCU can take over its work. The data transmission between the second MCU and the host may use USB or be replaced by another interface such as a network port.
Example 2: the embodiment of the invention provides a time synchronization method, which is mainly based on the time synchronization system, and the main flow of the method is described as follows.
As shown in fig. 2:
and S100, initializing the IMU chip by the first MCU, setting the IMU chip to be in an interrupt mode, and interrupting once every time a group of data is generated. And setting the interrupt period of the IMU chip to be T, wherein the image triggering period is nT and n is an integer. Wherein n is an integer greater than or equal to 1. If the IMU frequency is 200HZ and n is 4, then T is 1/200 and nT is 1/50, and the image frame rate is 50 HZ.
S200: the first MCU initializes the ISP chip, sets it to trigger mode, and sets the triggered synchronization mode of the ISP chip to take the middle moment of image exposure as the synchronization point.
S300: the IMU chip periodically generates data interrupts; the first MCU reads the IMU data after receiving an IMU data interrupt signal, and sends a GPIO trigger signal to the ISP chip after the IMU data interrupt at time m×nT, where m is an integer greater than or equal to 1.
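A minimal sketch of how the first MCU could implement this interrupt handling is given below; the helper functions now_us(), read_and_store_imu_sample() and gpio_pulse_trigger_to_isp() are hypothetical names introduced for illustration, not functions defined by the patent.

    #include <stdint.h>

    #define N 4u                                            /* image trigger divider n (see S100) */

    extern uint64_t now_us(void);                           /* hypothetical: current time in microseconds */
    extern void read_and_store_imu_sample(uint64_t t_us);   /* hypothetical: read raw IMU data, attach timestamp */
    extern void gpio_pulse_trigger_to_isp(void);            /* hypothetical: pulse the GPIO trigger line */

    static volatile uint32_t imu_irq_count;

    /* Called on every IMU data-ready interrupt, i.e. once per period T. */
    void imu_data_ready_irq(void)
    {
        imu_irq_count++;
        read_and_store_imu_sample(now_us());        /* IMU raw data + IMU timestamp */

        /* At every time m*n*T, send the GPIO trigger to the ISP chip; the next image's
           mid-exposure is then placed on the following interrupt at (m+1)*n*T (see S400). */
        if (imu_irq_count % N == 0u) {
            gpio_pulse_trigger_to_isp();
        }
    }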
S400: after the first MCU sends the GPIO trigger signal to the ISP chip, the ISP chip immediately controls the next image exposure on receiving the trigger signal and automatically calculates the exposure start and end times so that the middle moment of the image exposure coincides with the IMU chip interrupt at time (m+1)×nT. The first MCU takes the time at which the IMU triggers the interrupt at time (m+1)×nT as the common timestamp of the image and the IMU at that moment.
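The exposure-window arithmetic described here can be sketched as follows; exposure_us and isp_schedule_exposure() are assumed names used for illustration (in the invention the calculation is performed inside the ISP chip):

    #include <stdint.h>

    extern void isp_schedule_exposure(uint64_t start_us, uint64_t end_us);  /* hypothetical */

    /* Place the exposure window so that its middle moment coincides with the
       IMU interrupt at time (m+1)*n*T (target_mid_us). */
    void schedule_mid_exposure(uint64_t target_mid_us, uint32_t exposure_us)
    {
        uint64_t start_us = target_mid_us - exposure_us / 2u;
        uint64_t end_us   = start_us + exposure_us;
        isp_schedule_exposure(start_us, end_us);
    }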
S500: the ISP chip starts the exposure under the control of the first MCU. At the middle moment of the image exposure, the first MCU receives the data interrupt of time (m+1)×nT, which corresponds to the middle moment of the image exposure; the time at which the IMU chip triggers this interrupt is the common timestamp of the exposed image and the IMU at that moment.
S600: before the exposure ends, the first MCU sends the IMU raw data, the IMU timestamp of the previous period and the image timestamp to the second MCU.
S700: the first MCU starts to receive the image at time (m+1)×nT.
S800: the second MCU sends the received IMU timestamp, image timestamp and IMU data over USB or a network interface to a host running a visual odometry algorithm for computation.
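On the host side, the shared timestamps make it straightforward to collect the IMU samples lying between two consecutive image timestamps for integration by the visual odometry algorithm. A minimal sketch follows, with illustrative type and function names that are not part of the patent:

    #include <stddef.h>
    #include <stdint.h>

    typedef struct {
        uint64_t t_us;       /* IMU timestamp */
        float    gyro[3];
        float    accel[3];
    } imu_sample_t;

    /* Count the IMU samples whose timestamps fall in (prev_frame_us, curr_frame_us],
       i.e. the samples to integrate between the two image frames. */
    size_t imu_window_count(const imu_sample_t *samples, size_t n,
                            uint64_t prev_frame_us, uint64_t curr_frame_us)
    {
        size_t count = 0;
        for (size_t i = 0; i < n; i++) {
            if (samples[i].t_us > prev_frame_us && samples[i].t_us <= curr_frame_us) {
                count++;
            }
        }
        return count;
    }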
As shown in Fig. 3, after system initialization is complete, two complete nT-cycle trigger signals are first sent to the ISP chip; the first two nT cycles do not yet trigger image capture, and image capture begins in the third nT cycle.
Example 3: a visual odometry system comprising a memory and a processor, the memory having stored thereon a computer program that can be loaded by the processor and that executes the method of embodiment 2.
Example 4: a computer-readable storage medium storing a computer program capable of being loaded by a processor and executing the method of embodiment 2.

Claims (10)

1. A method of time synchronization, comprising:
setting the interrupt period of the inertial measurement module to T and the trigger period of the image to nT;
the inertial measurement module at time m×nT sends a trigger signal to the image acquisition module, and the image acquisition module, after receiving the trigger signal, controls the image imaging time so that the imaging moment of the image is (m+1)×nT,
wherein m and n are integers greater than or equal to 1.
2. The time synchronization method according to claim 1, wherein the image acquisition module is set to a triggered synchronization mode with the middle moment of the image exposure as the synchronization point.
3. The time synchronization method according to claim 1, wherein the raw data of the inertial measurement module at time (n+1)T, the inertial measurement module timestamp, the image and the image timestamp are sent to the host.
4. A time synchronization system, comprising an inertial measurement module, an image acquisition module and a microprocessor module for implementing the method of any one of claims 1 to 3, wherein
the data interrupt period generated by the inertial measurement module is T;
the trigger period of the image acquisition module is nT;
the microprocessor module is respectively in communication connection with the inertial measurement module and the image acquisition module;
and the inertial measurement module at time m×nT sends a trigger signal to the image acquisition module, and the image acquisition module, after receiving the trigger signal, controls the image imaging time so that the imaging moment of the image is (m+1)×nT.
5. The time synchronization system according to claim 4, wherein the image acquisition module comprises an image acquisition unit and an ISP chip, the ISP chip is set to a trigger mode, and the triggered synchronization mode of the ISP chip is set to take the middle moment of the image exposure of the image acquisition unit as the synchronization point.
6. A time synchronization system according to claim 4, wherein said image acquisition unit comprises at least one camera.
7. The time synchronization system according to claim 4, wherein the microprocessor module comprises at least one MCU, and the MCU is respectively connected with the inertial measurement module and the image acquisition module in a communication manner.
8. The time synchronization system according to claim 4, wherein the microprocessor module comprises a first MCU and a second MCU; the first MCU is in communication connection with the inertial measurement module and the image acquisition module and is used for acquiring the image timestamp, the IMU timestamp and the IMU raw data and for sending the trigger signal;
the second MCU is in communication connection with the image acquisition module and the first MCU and is used for acquiring images and for receiving the image timestamp, the IMU timestamp and the IMU raw data from the first MCU.
9. A visual odometry system comprising a memory and a processor, the memory having stored thereon a computer program that can be loaded by the processor and that executes the method of any one of claims 1 to 3.
10. A computer-readable storage medium, in which a computer program is stored which can be loaded by a processor and which executes the method of any one of claims 1 to 3.
CN202010222998.5A 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium Active CN111405139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010222998.5A CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010222998.5A CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Publications (2)

Publication Number Publication Date
CN111405139A (en) 2020-07-10
CN111405139B (en) 2023-10-17

Family

ID=71432929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010222998.5A Active CN111405139B (en) 2020-03-26 2020-03-26 Time synchronization method, system, visual mileage system and storage medium

Country Status (1)

Country Link
CN (1) CN111405139B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105806334A (en) * 2016-03-07 2016-07-27 苏州中德睿博智能科技有限公司 Inertial sensor and vision sensor data synchronous acquisition system
CN106027909A (en) * 2016-07-05 2016-10-12 大连海事大学 System and method for synchronously collecting shipboard videos based on MEMS inertial sensor and camera
US20190120948A1 (en) * 2017-10-19 2019-04-25 DeepMap Inc. Lidar and camera synchronization
KR20190108307A (en) * 2018-03-14 2019-09-24 국방과학연구소 Method and apparatus for providing navigation data in inertial navigation system
CN108645402A (en) * 2018-03-30 2018-10-12 深圳清创新科技有限公司 Camera shooting and inertia measurement sensing device, scene cut and pose computing system
CN109922260A (en) * 2019-03-04 2019-06-21 中国科学院上海微系统与信息技术研究所 The method of data synchronization and synchronizing device of image and inertial sensor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113225481A (en) * 2021-04-30 2021-08-06 深圳市道通智能汽车有限公司 Camera image exposure time acquisition system, vehicle and method
CN114007060A (en) * 2021-09-30 2022-02-01 青岛歌尔声学科技有限公司 Image data and IMU data processing system, method, medium, and head-mounted device
CN116381468A (en) * 2023-06-05 2023-07-04 浙江瑞测科技有限公司 Method and device for supporting multi-chip parallel test by single image acquisition card
CN116381468B (en) * 2023-06-05 2023-08-22 浙江瑞测科技有限公司 Method and device for supporting multi-chip parallel test by single image acquisition card

Also Published As

Publication number Publication date
CN111405139B (en) 2023-10-17

Similar Documents

Publication Publication Date Title
CN111405139B (en) Time synchronization method, system, visual mileage system and storage medium
CN109104259B (en) Multi-sensor time synchronization system and method
CN109922260B (en) Data synchronization method and synchronization device for image sensor and inertial sensor
US10984554B2 (en) Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
CN107341831B (en) IMU (inertial measurement Unit) -assisted visual feature robust tracking method and device
CN109816696A (en) A kind of robot localization and build drawing method, computer installation and computer readable storage medium
US7184030B2 (en) Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects
EP2550798B1 (en) Synchronization of navigation and image information for handheld scanner
WO2018228352A1 (en) Synchronous exposure method and apparatus and terminal device
WO2018228353A1 (en) Control method and apparatus for synchronous exposure of multi-camera system, and terminal device
CN110139066B (en) Sensor data transmission system, method and device
CN110022444B (en) Panoramic photographing method for unmanned aerial vehicle and unmanned aerial vehicle using panoramic photographing method
EP4020855A1 (en) Time synchronization method and apparatus
CN111860604B (en) Data fusion method, system and computer storage medium
CN107948463B (en) Camera synchronization method, device and system
CN104764442A (en) Method and device for determining exposure time of aerial photogrammetric camera in light-small unmanned aerial vehicle
CN103516981A (en) Image pickup apparatus, image pickup system, image pickup method and computer readable non-transitory recording medium
CN113496545A (en) Data processing system, method, sensor, mobile acquisition backpack and equipment
WO2022142403A1 (en) Vr system and positioning and tracking method therefor
WO2018227329A1 (en) Synchronous exposure method and device, and terminal device
CN108988974A (en) Measurement method, device and the system to electronic equipment time synchronization of time delays
CN107314770A (en) A kind of mobile robot and its master controller, alignment system and method
CN110740227A (en) Camera time synchronization device and method based on GNSS time service and image display information coding mode
JP2019016869A (en) Imaging device, camera-equipped drone, mode control method, and program
CN113219479A (en) Camera and laser radar synchronization method and system of intelligent driving control system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200918

Address after: 610000 9, 3 building 200, Tianfu five street, hi tech Zone, Chengdu, Sichuan.

Applicant after: Qingke Xiaomei robot technology (Chengdu) Co.,Ltd.

Address before: 214000 13 / F, east side, building A1, No. 777, Jianshe West Road, Binhu District, Wuxi City, Jiangsu Province

Applicant before: SLIGHTECH INTELLIGENT SCIENCE & TECHNOLOGY (JIANGSU) Co.,Ltd.

GR01 Patent grant
GR01 Patent grant