WO2023226156A1 - Timestamp correction method, apparatus, device, medium and computer program product - Google Patents

Timestamp correction method, apparatus, device, medium and computer program product

Info

Publication number
WO2023226156A1
Authority
WO
WIPO (PCT)
Prior art keywords
key frame
motion data
frame image
motion
time difference
Prior art date
Application number
PCT/CN2022/103380
Other languages
English (en)
French (fr)
Inventor
蒿杰
詹恒泽
孙亚强
梁俊
史佳锋
Original Assignee
广东人工智能与先进计算研究院
芯跳科技(广州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广东人工智能与先进计算研究院 and 芯跳科技(广州)有限公司
Publication of WO2023226156A1 publication Critical patent/WO2023226156A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2004Aligning objects, relative positioning of parts

Definitions

  • the present application relates to the field of sensor data processing technology, and in particular to a time stamp correction method, device, equipment, media and computer program product.
  • SLAM: Simultaneous Localization and Mapping.
  • IMU: Inertial Measurement Unit.
  • A SLAM system usually includes two kinds of sensors: a camera and an IMU.
  • the SLAM system samples the two sensors to obtain the corresponding image data and IMU data, together with the sampling timestamps of the image data and the IMU data. It is usually assumed that the obtained sampling timestamp is the time of the sensor sampling moment.
  • This application provides a timestamp correction method, device, equipment, storage medium and computer program product to solve the problem in the prior art that the timestamps of the camera and the IMU deviate from each other, and to improve the calculation accuracy of the visual-inertial odometry.
  • This application provides a timestamp correction method, including:
  • acquiring a camera image, selecting key frame images in the camera image, and extracting motion data between adjacent key frame images; pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose; calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residual; and correcting the timestamp of the key frame images and the timestamp of the motion data according to the time difference.
  • the steps of obtaining a camera image, selecting a key frame image in the camera image, and extracting motion data between adjacent key frame images include: obtaining camera images according to a first preset acquisition frequency and a preset time period, and selecting key frame images from the camera images according to a preset rule; obtaining inertial measurement unit (IMU) data according to a second preset acquisition frequency and the preset time period; and extracting motion data between adjacent key frame images according to the IMU data.
  • the step of pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose includes: pre-integrating the motion data to obtain the relative pose of the adjacent key frame images, where the relative pose includes a position difference, a speed difference and a rotation angle difference; and calculating the motion residual based on the position difference, the speed difference and the rotation angle difference.
  • the step of calculating the time difference between the key frame image and the motion data based on the landmark points corresponding to the key frame image and the motion residual includes: calculating the reprojection error between the key frame image and the landmark points according to the landmark points corresponding to the key frame image; and calculating the time difference between the key frame image and the motion data based on the reprojection error and the motion residual.
  • the step of calculating the time difference between the key frame image and the motion data based on the reprojection error and the motion residual includes: obtaining a first information matrix corresponding to the relative pose and a second information matrix corresponding to the reprojection error; determining an optimization equation based on the reprojection error, the motion residual, the first information matrix and the second information matrix; and calculating the time difference between the key frame image and the motion data according to the optimization equation.
  • the step of calculating the time difference between the key frame image and the motion data according to the optimization equation includes: substituting the vertex coordinates and edge feature points of the key frame image into the optimization equation to calculate the time difference between the key frame image and the motion data.
  • This application also provides a timestamp correction device, including:
  • a motion data extraction module is used to obtain camera images, select key frame images in the camera images, and extract motion data between adjacent key frame images;
  • a pre-integration module used to pre-integrate the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose
  • a time difference calculation module configured to calculate the time difference between the key frame image and the motion data based on the landmark points corresponding to the key frame image and the motion residual;
  • a timestamp correction module configured to correct the timestamp of the key frame image and the timestamp of the motion data according to the time difference.
  • This application also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, it implements any one of the above timestamp correction methods.
  • This application also provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements any one of the above timestamp correction methods.
  • the present application also provides a computer program product, including a computer program that implements any one of the above time stamp correction methods when executed by a processor.
  • With the timestamp correction method, device, equipment, storage medium and computer program product provided by this application, camera images are obtained, key frame images are selected from the camera images, and motion data between adjacent key frame images is extracted. The motion data is pre-integrated to obtain the relative pose and the motion residual. Based on the landmark points corresponding to the key frame images and the motion residual, the time difference between the key frame images and the motion data is calculated. Finally, the timestamps of the key frame images and the motion data are corrected according to this time difference, eliminating the timestamp deviation between the camera and the IMU. Since this deviation degrades the operation of the VIO system and reduces the calculation accuracy of the visual-inertial odometry, this application improves the calculation accuracy of the visual-inertial odometry.
  • Figure 1 is the first schematic flowchart of the timestamp correction method provided by this application.
  • Figure 2 is the second schematic flowchart of the timestamp correction method provided by this application.
  • FIG. 3 is a schematic structural diagram of the timestamp correction device provided by this application.
  • Figure 4 is a schematic structural diagram of an electronic device provided by this application.
  • This application provides a timestamp correction method, including:
  • Step S100 obtain a camera image, select a key frame image in the camera image, and extract motion data between adjacent key frame images;
  • a SLAM system usually includes two sensors: a camera and an IMU.
  • the SLAM system samples the two sensors, obtains the corresponding image data and IMU data, and obtains the timestamp of the data. It is usually assumed that the timestamp of the obtained data is the time corresponding to the sensor sampling moment.
  • problems such as trigger delay and transmission delay in the system cause a deviation between the timestamp of the camera and IMU and the real sampling time.
  • The timestamp correction method of this application solves the above problems. First, camera images within a period of time are acquired at a certain frame rate; for example, at an image frame rate of 30 Hz, camera images are collected for 3 minutes (5400 frames in total), and one key frame image is determined every 5 frames.
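The frame-count arithmetic and selection rule above can be sketched in a few lines (a hedged illustration; the helper name and `step` parameter are assumptions, not from the patent):

```python
def select_keyframes(num_frames, step=5):
    """Return indices of frames kept as keyframes: one every `step` frames."""
    return list(range(0, num_frames, step))

frame_rate_hz = 30
duration_s = 3 * 60                       # 3 minutes of video
num_frames = frame_rate_hz * duration_s   # 5400 frames at 30 Hz
keyframes = select_keyframes(num_frames)  # every 5th frame is a keyframe
```

With these figures, 5400 frames yield 1080 key frame images.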
  • the data collected by the IMU sensor is mainly acceleration in multiple directions and angular velocity in multiple directions, which represents changes in the motion state of the SLAM system.
  • Step S200 Pre-integrate the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose;
  • the relative pose of the SLAM system between two consecutive key frames, key frame image i and key frame image i+1, can be obtained.
  • Δp_{i,i+1}, Δv_{i,i+1} and ΔR_{i,i+1} respectively represent the changes in position, velocity and rotation angle of the SLAM system between key frame image i and key frame image i+1.
  • the information matrix ΣI_{i,i+1} of the entire measurement vector can also be obtained.
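As a rough illustration of the pre-integration that produces Δp_{i,i+1}, Δv_{i,i+1} and ΔR_{i,i+1}, the sketch below accumulates IMU samples with simple Euler steps. It is a hedged simplification (bias and noise terms are ignored, measurements are assumed gravity-compensated in the body frame) with illustrative names; the patent provides no code:

```python
import numpy as np

def so3_exp(w):
    """Rodrigues formula: rotation matrix for a small rotation vector w."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def preintegrate(accels, gyros, dt):
    """Accumulate delta-position, delta-velocity and delta-rotation
    over the IMU samples collected between two adjacent keyframes."""
    dp, dv, dR = np.zeros(3), np.zeros(3), np.eye(3)
    for a, w in zip(accels, gyros):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2  # position update
        dv = dv + (dR @ a) * dt                        # velocity update
        dR = dR @ so3_exp(w * dt)                      # rotation update
    return dp, dv, dR
```

For constant acceleration and no rotation, the result reduces to the familiar Δv = aT and Δp ≈ ½aT².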
  • the IMU (motion) residual r_I = [r_{Δp}, r_{Δv}, r_{ΔR}] can be obtained through the following formulas, where R_i^T represents the transpose of the rotation matrix of the SLAM system corresponding to key frame image i, Δt_{i,i+1} is the time between the two key frames, and g represents the acceleration of gravity: r_{Δp} = R_i^T (p_{i+1} − p_i − v_i Δt_{i,i+1} − ½ g Δt_{i,i+1}²) − Δp_{i,i+1}; r_{Δv} = R_i^T (v_{i+1} − v_i − g Δt_{i,i+1}) − Δv_{i,i+1}; r_{ΔR} = Log(ΔR_{i,i+1}^T R_i^T R_{i+1}).
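A hedged sketch of evaluating such a motion residual from the pre-integrated terms and the states at the two key frames; the function name, argument order, and gravity convention are assumptions for illustration:

```python
import numpy as np

def so3_log(R):
    """Rotation vector of a rotation matrix (SO(3) logarithm)."""
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2.0 * np.sin(theta)) * w

def imu_residual(Ri, pi, vi, Rj, pj, vj, dp, dv, dR, dt,
                 g=np.array([0.0, 0.0, -9.81])):
    """Stack the position, velocity and rotation residuals between the
    states at keyframes i and j and the pre-integrated measurements."""
    r_p = Ri.T @ (pj - pi - vi * dt - 0.5 * g * dt ** 2) - dp
    r_v = Ri.T @ (vj - vi - g * dt) - dv
    r_R = so3_log(dR.T @ Ri.T @ Rj)
    return np.concatenate([r_p, r_v, r_R])
```

A state trajectory exactly consistent with the pre-integrated measurements yields a zero residual.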
  • Step S300 Calculate the time difference between the key frame image and the motion data according to the landmark point corresponding to the key frame image and the motion residual;
  • the landmark points corresponding to the key frame images refer to 3D points in the real world.
  • the time difference optimization problem between the camera and the IMU can be expressed as a combination of the IMU residuals and the visual (reprojection) residuals, i.e., minimizing the sum of the weighted IMU residual terms over adjacent key frames and the weighted reprojection residual terms over all observed landmark points, where K_j represents the set of key frame images in which the j-th 3D landmark point is observed.
  • the timestamp correction method proposed in this embodiment uses the time difference between the camera and the IMU as part of the state variable.
  • t_IMU = t_cam + dt
  • the timestamp of the camera image is dt larger than the real sampling timestamp of the corresponding data.
  • Assume that the timestamp corresponding to the k-th key frame image I_k is t_k; then the real sampling time of the key frame image I_k is t_k − dt.
  • Take a feature point P_j on I_k as an example; its image coordinates are u_{kj}. The true position of the feature point at time t_k has shifted. Assuming that the feature point moves uniformly in the image plane during the short time dt, the estimated coordinates of the feature point at time t_k are u'_{kj} = u_{kj} + dt · V_{kj}, where V_{kj} is the motion speed of the feature point in the image pixel plane.
  • the motion speed in the image pixel plane can be estimated from the positions of the feature point in the two neighboring key frames, that is, V_{kj} = (u_{k+1,j} − u_{kj}) / (t_{k+1} − t_k). Based on the above assumptions, in the VIO system the time-offset-corrected feature coordinates u'_{kj}, which carry the parameter dt, replace the previous feature coordinates, so that the time deviation parameter dt is introduced into the above optimization equation; the optimization equation can therefore be written in a form containing the time difference.
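The constant-velocity correction of a feature's pixel coordinates can be sketched as follows (a hedged illustration with assumed names, not the patent's implementation):

```python
def pixel_velocity(u_k, u_k1, t_k, t_k1):
    """Image-plane velocity of a feature estimated from its positions
    in two neighboring keyframes: V = (u_{k+1} - u_k) / (t_{k+1} - t_k)."""
    return [(b - a) / (t_k1 - t_k) for a, b in zip(u_k, u_k1)]

def shift_feature(u_k, v, dt):
    """Time-offset-corrected coordinates: u' = u + dt * V."""
    return [a + dt * s for a, s in zip(u_k, v)]
```

For example, a feature seen at (100, 50) and (110, 60) in keyframes 0.2 s apart has pixel velocity (50, 50) px/s, so a dt of 0.02 s shifts it to (101, 51).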
  • the vertices and edges in the graph optimization process are constructed.
  • the vertices represent the unknown variables to be optimized, and the edges are the residual formulas connecting the optimization variables, and the time difference dt is calculated through optimization.
  • Step S400 Correct the time stamp of the key frame image and the time stamp of the motion data according to the time difference.
  • the timestamp of the key frame image and the timestamp of the motion data are corrected based on the calculated time difference, thereby eliminating the impact of the timestamp deviation on the VIO system and improving the calculation accuracy of the visual-inertial odometry.
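A minimal sketch of applying the estimated time difference, assuming the relation t_IMU = t_cam + dt used above (function names are illustrative; equivalently, IMU stamps could be shifted by −dt instead):

```python
def correct_image_timestamps(cam_stamps, dt):
    """Align camera timestamps with the IMU time base: t_imu = t_cam + dt."""
    return [t + dt for t in cam_stamps]

def residual_offset(cam_stamps, imu_stamps):
    """Mean remaining offset between paired camera and IMU timestamps."""
    diffs = [ti - tc for tc, ti in zip(cam_stamps, imu_stamps)]
    return sum(diffs) / len(diffs)
```

After correction, the residual offset between paired camera and IMU timestamps should be close to zero.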
  • This embodiment obtains camera images, selects key frame images from the camera images, and extracts motion data between adjacent key frame images. By pre-integrating the motion data, the relative pose and the motion residual are obtained. The time difference between the key frame images and the motion data is calculated according to the landmark points corresponding to the key frame images and the motion residual. Finally, the timestamps of the key frame images and the motion data are corrected according to the time difference, eliminating the timestamp deviation between the camera and the IMU. Since this deviation affects the operation of the VIO system and reduces the calculation accuracy of the visual-inertial odometry, this application improves the calculation accuracy of the visual-inertial odometry.
  • the timestamp correction method provided by the embodiment of this application may also include:
  • Step S101 Obtain camera images according to the first preset collection frequency and preset time period, and select key frame images in the camera images according to preset rules;
  • Step S102 obtain inertial sensor IMU data according to the second preset collection frequency and the preset period
  • Step S103 Extract motion data between adjacent key frame images according to the IMU data.
  • Specifically, camera images within a period of time are acquired at a certain frame rate; this image frame rate is the first preset acquisition frequency in this embodiment. For example, at a frame rate of 30 Hz, camera images are collected for 3 minutes (the preset time period in this embodiment), giving 5400 frames of camera images in total, and key frame images are selected from them according to the preset rule.
  • IMU data within the preset time period is collected at the second preset acquisition frequency, which is greater than the first preset acquisition frequency.
  • the IMU data within the interval corresponding to adjacent key frame images is the motion data in this embodiment.
  • the data collected by the IMU sensor is mainly acceleration in multiple directions and angular velocity in multiple directions, which represents changes in the motion state of the SLAM system.
  • This embodiment collects camera images and IMU data within a certain period of time at a certain frequency, calculates the time difference between key frame images and motion data based on the collected images and IMU data, and finally timestamps the key frame images and motion data based on the time difference. Correction eliminates the timestamp deviation between the camera and the IMU and improves the calculation accuracy of the visual inertial odometry.
  • the timestamp correction method provided by the embodiment of this application may also include:
  • Step S201 pre-integrate the motion data to obtain the relative pose of the adjacent key frame images, where the relative pose includes position difference, speed difference and rotation angle difference;
  • Step S202 Calculate motion residuals based on the position difference, the speed difference and the rotation angle difference.
  • the relative pose of the SLAM system between two consecutive frames of key frame image i to key frame image i+1 can be obtained by performing a pre-integration operation on the IMU data.
  • the relative pose includes Δp_{i,i+1}, Δv_{i,i+1} and ΔR_{i,i+1}, where Δp_{i,i+1} represents the change in position of the SLAM system between key frame image i and key frame image i+1, that is, the position difference in this embodiment; Δv_{i,i+1} represents the change in velocity, that is, the speed difference; and ΔR_{i,i+1} represents the change in rotation angle, that is, the rotation angle difference. The motion residual is calculated from the position difference, the speed difference and the rotation angle difference using the motion residual formulas, where R_i^T represents the transpose of the rotation matrix of the SLAM system corresponding to key frame image i and g represents the acceleration of gravity.
  • This embodiment calculates motion residuals through pre-integration, and performs timestamp correction on key frame images and motion data based on the motion residuals, thereby eliminating the timestamp deviation between the camera and the IMU and improving the calculation accuracy of the visual inertial odometry.
  • the timestamp correction method provided by the embodiment of this application may also include:
  • Step S310 Calculate the reprojection error between the key frame image and the landmark point according to the landmark point corresponding to the key frame image;
  • Step S320 Calculate the time difference between the key frame image and the motion data based on the reprojection error and the motion residual.
  • This embodiment calculates the time difference between the key frame image and the motion data based on the landmark points and motion residuals corresponding to the key frame image, eliminating the timestamp deviation between the camera and the IMU, and improving the calculation accuracy of the visual inertial odometry.
  • the timestamp correction method provided by the embodiment of the present application may also include:
  • Step S321 obtain the first information matrix corresponding to the relative pose and the second information matrix corresponding to the reprojection error
  • Step S322 determine an optimization equation based on the reprojection error, the motion residual, the first information matrix, and the second information matrix;
  • Step S323 Calculate the time difference between the key frame image and the motion data according to the optimization equation.
  • the timestamp correction method uses the time difference between the camera and the IMU as part of the state variable.
  • t_IMU = t_cam + dt
  • the timestamp of the camera image is dt larger than the real sampling timestamp of the corresponding data.
  • Assume that the timestamp corresponding to the k-th key frame image I_k is t_k; then the real sampling time of the key frame image I_k is t_k − dt.
  • Take a feature point P_j on I_k as an example; its image coordinates are u_{kj}. The true position of the feature point at time t_k has shifted. Assuming that the feature point moves uniformly in the image plane during the short time dt, the estimated coordinates of the feature point at time t_k are u'_{kj} = u_{kj} + dt · V_{kj}, where V_{kj} is the motion speed of the feature point in the image pixel plane.
  • the motion speed in the image pixel plane can be estimated from the positions of the feature point in the two neighboring key frames, that is, V_{kj} = (u_{k+1,j} − u_{kj}) / (t_{k+1} − t_k). Based on the above assumptions, in the VIO system the time-offset-corrected feature coordinates u'_{kj}, which carry the parameter dt, replace the previous feature coordinates, so that the time deviation parameter dt is introduced into the above optimization equation; the optimization equation can therefore be written in a form containing the time difference. Here, ΣI_{i,i+1} is the first information matrix and Σ_{ij} is the second information matrix.
  • This embodiment calculates the time difference between the key frame image and the motion data based on the reprojection error, the motion residual, the first information matrix and the second information matrix, eliminating the timestamp deviation between the camera and the IMU and improving the calculation accuracy of the visual-inertial odometry.
  • the timestamp correction method provided by the embodiment of this application may also include:
  • Step S3231 Substitute the vertex coordinates and edge feature points of the key frame image into the optimization equation to calculate the time difference between the key frame image and the motion data.
  • the vertices in the key frame image are treated as unknown variables to be optimized, alongside the time difference dt, and the edges serve as the residual terms connecting the optimization variables; substituting them into the optimization equation and optimizing yields the time difference dt.
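When all other states are held fixed, the optimization over dt alone reduces, under the constant-velocity feature model, to a one-dimensional linear least-squares problem with a closed-form solution. The sketch below illustrates only this sub-problem (the patent optimizes dt jointly with the other graph variables; names are assumptions):

```python
def solve_dt(us, vs, u_hats):
    """Least-squares dt minimizing sum ||u + dt*V - u_hat||^2, where each
    u is an observed pixel, V its image-plane velocity, and u_hat the
    reprojected landmark position. Closed form: dt = <V, u_hat-u> / <V, V>."""
    num = 0.0
    den = 0.0
    for u, v, u_hat in zip(us, vs, u_hats):
        for ui, vi, hi in zip(u, v, u_hat):
            num += vi * (hi - ui)
            den += vi * vi
    return num / den
```

For example, if a feature observed at (100, 50) with velocity (40, −20) px/s reprojects to (102, 49), the recovered offset is dt = 0.05 s.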
  • the time stamp deviation between the camera and the IMU is eliminated, and the calculation accuracy of the visual inertial odometry is improved.
  • the timestamp correction device provided by the present application is described below.
  • the timestamp correction device described below and the timestamp correction method described above can be referred to correspondingly.
  • This application also provides a timestamp correction device, including:
  • the motion data extraction module 10 is used to obtain camera images, select key frame images in the camera images, and extract motion data between adjacent key frame images;
  • the pre-integration module 20 is used to pre-integrate the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose;
  • the time difference calculation module 30 is used to calculate the time difference between the key frame image and the motion data based on the landmark points corresponding to the key frame image and the motion residual;
  • the time stamp correction module 40 is configured to correct the time stamp of the key frame image and the time stamp of the motion data according to the time difference.
  • the motion data extraction module includes:
  • An image acquisition unit configured to acquire camera images according to a first preset acquisition frequency and a preset time period, and select key frame images in the camera images according to preset rules
  • IMU data acquisition unit configured to acquire inertial sensor IMU data according to the second preset collection frequency and the preset period
  • a motion data extraction unit configured to extract motion data between adjacent key frame images according to the IMU data.
  • the pre-integration module includes:
  • a pre-integration unit configured to pre-integrate the motion data to obtain the relative pose of the adjacent key frame images, where the relative pose includes position difference, speed difference and rotation angle difference;
  • a motion residual calculation unit configured to calculate a motion residual based on the position difference, the speed difference and the rotation angle difference.
  • the time difference calculation module includes:
  • a reprojection error calculation unit configured to calculate the reprojection error between the key frame image and the landmark point according to the landmark point corresponding to the key frame image
  • a first time difference calculation unit configured to calculate the time difference between the key frame image and the motion data based on the reprojection error and the motion residual.
  • the first time difference calculation unit includes:
  • An information matrix acquisition unit configured to acquire the first information matrix corresponding to the relative pose and the second information matrix corresponding to the reprojection error
  • An optimization equation determination unit configured to determine an optimization equation based on the reprojection error, the motion residual, the first information matrix, and the second information matrix;
  • a second time difference calculation unit is configured to calculate the time difference between the key frame image and the motion data according to the optimization equation.
  • the second time difference calculation unit includes:
  • a third time difference calculation unit is used to substitute the vertex coordinates and edge feature points of the key frame image into the optimization equation to calculate the time difference between the key frame image and the motion data.
  • Figure 4 illustrates a schematic diagram of the physical structure of an electronic device.
  • the electronic device may include: a processor (processor) 410, a communications interface (Communications Interface) 420, a memory (memory) 430 and a communication bus 440.
  • the processor 410, the communication interface 420, and the memory 430 complete communication with each other through the communication bus 440.
  • the processor 410 can call logical instructions in the memory 430 to perform a timestamp correction method, which includes: acquiring a camera image, selecting key frame images in the camera image, and extracting motion data between adjacent key frame images; pre-integrating the motion data to obtain the relative poses of the adjacent key frame images and the motion residuals corresponding to the relative poses; calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residuals; and correcting the timestamp of the key frame images and the timestamp of the motion data according to the time difference.
  • the above-mentioned logical instructions in the memory 430 can be implemented in the form of software functional units and can be stored in a computer-readable storage medium when sold or used as an independent product.
  • the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.
  • the present application also provides a computer program product.
  • the computer program product includes a computer program.
  • the computer program can be stored on a non-transitory computer-readable storage medium; when the computer program is executed, the computer performs the timestamp correction method provided by each of the above embodiments.
  • the method includes: acquiring a camera image, selecting key frame images in the camera image, and extracting motion data between adjacent key frame images; pre-integrating the motion data to obtain the relative poses of the adjacent key frame images and the motion residuals corresponding to the relative poses; calculating the time difference between the key frame images and the motion data based on the landmark points corresponding to the key frame images and the motion residuals; and correcting the timestamp of the key frame images and the timestamp of the motion data according to the time difference.
  • the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored.
  • when executed by a processor, the computer program performs the timestamp correction method provided by each of the above embodiments.
  • the method includes: acquiring a camera image, selecting key frame images in the camera image, and extracting motion data between adjacent key frame images; pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose; calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residual; and correcting the timestamp of the key frame images and the timestamp of the motion data according to the time difference.
  • the device embodiments described above are only illustrative.
  • the units described as separate components may or may not be physically separated.
  • the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. A person of ordinary skill in the art can understand and implement this without creative effort.
  • each embodiment can be implemented by software plus a necessary general hardware platform, and of course, it can also be implemented by hardware.
  • the computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., including a number of instructions to cause a computer device (which can be a personal computer, a server, or a network device, etc.) to execute the methods described in various embodiments or certain parts of the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

A timestamp correction method, apparatus, device, medium and computer program product. The method includes: acquiring camera images, selecting key frame images from the camera images, and extracting motion data between adjacent key frame images (S100); pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose (S200); calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residual (S300); and correcting the timestamps of the key frame images and the timestamps of the motion data according to the time difference (S400). The timestamp correction method, apparatus, device, medium and computer program product overcome the defect in the prior art that the timestamps of the camera and the IMU deviate from each other, and improve the calculation accuracy of the visual-inertial odometry.

Description

Timestamp correction method, apparatus, device, medium and computer program product
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese patent application No. 202210576177.0, entitled "Timestamp correction method, apparatus, device, medium and computer program product" and filed on May 24, 2022, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This application relates to the technical field of sensor data processing, and in particular to a timestamp correction method, apparatus, device, medium and computer program product.
BACKGROUND
A SLAM (Simultaneous Localization and Mapping) system usually contains two kinds of sensors: a camera and an IMU (Inertial Measurement Unit). The SLAM system samples these two sensors to obtain the corresponding image data and IMU data, together with the sampling timestamps of the image data and the IMU data. It is usually assumed that the obtained sampling timestamp is the time of the sensor's sampling moment. However, trigger delays, transmission delays and the absence of accurately synchronized clocks in the system cause a deviation between the camera and IMU timestamps, and this deviation in the timestamps of the camera and IMU data streams affects the operation of the VIO (visual-inertial odometry) system, thereby reducing the calculation accuracy of the visual-inertial odometry.
SUMMARY
This application provides a timestamp correction method, apparatus, device, storage medium and computer program product to overcome the defect in the prior art that the timestamps of the camera and the IMU deviate from each other, and to improve the calculation accuracy of the visual-inertial odometry.
This application provides a timestamp correction method, including:
acquiring camera images, selecting key frame images from the camera images, and extracting motion data between adjacent key frame images;
pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose;
calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residual;
correcting the timestamps of the key frame images and the timestamps of the motion data according to the time difference.
According to a timestamp correction method provided by this application, the step of acquiring camera images, selecting key frame images from the camera images, and extracting motion data between adjacent key frame images includes:
acquiring camera images according to a first preset acquisition frequency and a preset time period, and selecting key frame images from the camera images according to a preset rule;
acquiring inertial measurement unit (IMU) data according to a second preset acquisition frequency and the preset time period;
extracting motion data between adjacent key frame images according to the IMU data.
According to a timestamp correction method provided by this application, the step of pre-integrating the motion data to obtain the relative pose of the adjacent key frame images and the motion residual corresponding to the relative pose includes:
pre-integrating the motion data to obtain the relative pose of the adjacent key frame images, where the relative pose includes a position difference, a speed difference and a rotation angle difference;
calculating the motion residual according to the position difference, the speed difference and the rotation angle difference.
According to a timestamp correction method provided by this application, the step of calculating the time difference between the key frame images and the motion data according to the landmark points corresponding to the key frame images and the motion residual includes:
calculating the reprojection error between the key frame images and the landmark points according to the landmark points corresponding to the key frame images;
calculating the time difference between the key frame images and the motion data according to the reprojection error and the motion residual.
According to a timestamp correction method provided by this application, the step of calculating the time difference between the key frame images and the motion data according to the reprojection error and the motion residual includes:
obtaining a first information matrix corresponding to the relative pose and a second information matrix corresponding to the reprojection error;
determining an optimization equation according to the reprojection error, the motion residual, the first information matrix and the second information matrix;
calculating the time difference between the key frame images and the motion data according to the optimization equation.
According to a timestamp correction method provided by this application, the step of calculating the time difference between the key frame images and the motion data according to the optimization equation includes:
substituting the vertex coordinates and edge feature points of the key frame images into the optimization equation to calculate the time difference between the key frame images and the motion data.
This application further provides a timestamp correction apparatus, including:
a motion data extraction module configured to acquire camera images, select keyframe images from the camera images, and extract motion data between adjacent keyframe images;
a pre-integration module configured to pre-integrate the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose;
a time difference calculation module configured to calculate the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual;
a timestamp correction module configured to correct the timestamps of the keyframe images and of the motion data according to the time difference.
This application further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements any of the timestamp correction methods described above.
This application further provides a non-transitory computer-readable storage medium storing a computer program that, when executed by a processor, implements any of the timestamp correction methods described above.
This application further provides a computer program product including a computer program that, when executed by a processor, implements any of the timestamp correction methods described above.
In the timestamp correction method, apparatus, device, storage medium, and computer program product provided by this application, camera images are acquired, keyframe images are selected from them, and the motion data between adjacent keyframe images is extracted; pre-integrating the motion data yields the relative pose and the motion residual; the time difference between the keyframe images and the motion data is computed from the landmark points corresponding to the keyframe images and the motion residual; and finally the timestamps of the keyframe images and of the motion data are corrected with that time difference. This eliminates the timestamp offset between the camera and the IMU; since such an offset degrades the operation of the VIO system and reduces the computational accuracy of the visual-inertial odometry, this application improves that computational accuracy.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions of this application or of the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Evidently, the drawings described below show some embodiments of this application, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1 is a first schematic flowchart of the timestamp correction method provided by this application;
FIG. 2 is a second schematic flowchart of the timestamp correction method provided by this application;
FIG. 3 is a schematic structural diagram of the timestamp correction apparatus provided by this application;
FIG. 4 is a schematic structural diagram of the electronic device provided by this application.
DETAILED DESCRIPTION
To make the objectives, technical solutions, and advantages of this application clearer, the technical solutions of this application are described clearly and completely below with reference to the drawings. Evidently, the described embodiments are some rather than all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the scope of protection of this application.
The timestamp correction method of this application is described below with reference to FIG. 1 and FIG. 2.
Referring to FIG. 1, this application provides a timestamp correction method, including:
Step S100: acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images.
Specifically, a SLAM system typically includes two kinds of sensors, a camera and an IMU. The SLAM system samples both sensors, obtaining image data, IMU data, and the timestamps of that data. The timestamp of the data is usually assumed to be the time at which the sensor took the sample; in practice, however, trigger latency and transmission latency cause the camera and IMU timestamps to deviate from the true sampling times, and the timestamp correction method proposed in this application addresses exactly this problem. First, camera images are acquired at a given frame rate over a period of time. For example, at a frame rate of 30 Hz, 5400 frames of camera images are collected within 3 minutes, and keyframe images are selected from these frames according to a rule such as taking one keyframe every 5 frames. The acquisition frequency of the IMU sensor is higher than the image frame rate, so the IMU data falling between the times of adjacent keyframe images, i.e. the motion data in this embodiment, can then be extracted. The data collected by the IMU sensor consists mainly of accelerations and angular velocities along multiple axes, which represent changes in the motion state of the SLAM system.
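As a minimal sketch of this acquisition step (not taken from the patent text; the function names `select_keyframes` and `slice_imu` and the sample layout are illustrative), keyframe selection and per-interval IMU extraction can be written as:

```python
# Sketch: pick every 5th frame as a keyframe, then collect the IMU samples
# whose timestamps fall between two adjacent keyframe times.

def select_keyframes(frame_timestamps, step=5):
    """Pick every `step`-th frame as a keyframe (the rule used in the example)."""
    return frame_timestamps[::step]

def slice_imu(imu_samples, t_start, t_end):
    """Return IMU samples whose timestamps fall in [t_start, t_end)."""
    return [s for s in imu_samples if t_start <= s["t"] < t_end]

# 30 Hz camera over 1 second, 200 Hz IMU over the same second
cam_ts = [i / 30.0 for i in range(30)]
imu = [{"t": i / 200.0, "gyro": (0, 0, 0), "acc": (0, 0, 9.81)} for i in range(200)]

keyframes = select_keyframes(cam_ts, step=5)
segment = slice_imu(imu, keyframes[0], keyframes[1])
print(len(keyframes), len(segment))  # -> 6 34
```

Because the IMU rate (here 200 Hz) exceeds the frame rate, each keyframe interval contains many motion samples, which is what makes the pre-integration of the next step meaningful.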
Step S200: pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose.
For the IMU, the states to be estimated are the pose $T_i = [R_i, P_i] \in SE(3)$, the velocity $v_i$ of the SLAM system at the moment keyframe image $i$ is captured, and the gyroscope and accelerometer biases $b_i^g$ and $b_i^a$. For the SLAM system, pre-integrating the IMU data yields the relative pose between two consecutive keyframe images $i$ and $i+1$: $\Delta p_{i,i+1}$, $\Delta v_{i,i+1}$, and $\Delta R_{i,i+1}$ denote the changes in position, velocity, and rotation of the SLAM system from keyframe image $i$ to keyframe image $i+1$. The information matrix $\Sigma_{I_{i,i+1}}$ of the whole measurement vector is obtained as well.
Specifically, the IMU residual $r_{I_{i,i+1}} = \left[ r_{\Delta R_{i,i+1}},\; r_{\Delta v_{i,i+1}},\; r_{\Delta p_{i,i+1}} \right]$ can be obtained through the following formulas (written in the standard pre-integration form):

$$r_{\Delta R_{i,i+1}} = \operatorname{Log}\!\left( \Delta R_{i,i+1}^{\top}\, R_i^{\top} R_{i+1} \right),$$

where $R_i^{\top}$ is the transpose of the rotation matrix of the SLAM system at keyframe image $i$;

$$r_{\Delta v_{i,i+1}} = R_i^{\top}\!\left( v_{i+1} - v_i - g\,\Delta t_{i,i+1} \right) - \Delta v_{i,i+1},$$

where $g$ is the gravitational acceleration; and

$$r_{\Delta p_{i,i+1}} = R_i^{\top}\!\left( P_{i+1} - P_i - v_i\,\Delta t_{i,i+1} - \tfrac{1}{2}\, g\,\Delta t_{i,i+1}^{2} \right) - \Delta p_{i,i+1},$$

where $\Delta t_{i,i+1}$ is the time elapsed between keyframe images $i$ and $i+1$.
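As a concrete illustration of how the pre-integrated quantities $\Delta R$, $\Delta v$, $\Delta p$ are accumulated from raw gyroscope and accelerometer samples, the sketch below uses simple Euler integration over one keyframe interval. It is an assumption-laden toy, not the patent's implementation: bias terms, noise propagation, and the information matrix are omitted, and all names are illustrative.

```python
import numpy as np

def skew(w):
    # Skew-symmetric matrix of a 3-vector (hat operator)
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

def exp_so3(w):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    th = np.linalg.norm(w)
    if th < 1e-10:
        return np.eye(3) + skew(w)
    K = skew(w / th)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def preintegrate(gyro, acc, dt):
    """Accumulate dR, dv, dp between two keyframes (bias/noise terms omitted)."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, acc):
        dp = dp + dv * dt + 0.5 * dR @ a * dt**2
        dv = dv + dR @ a * dt
        dR = dR @ exp_so3(w * dt)
    return dR, dv, dp

# Constant rotation about z at 0.1 rad/s, zero specific force, 200 Hz for 0.5 s
n, dt = 100, 1.0 / 200.0
gyro = [np.array([0.0, 0.0, 0.1])] * n
acc = [np.zeros(3)] * n
dR, dv, dp = preintegrate(gyro, acc, dt)
print(np.arctan2(dR[1, 0], dR[0, 0]))  # accumulated yaw, ~0.05 rad
```

The residuals of step S200 then compare these accumulated increments against the state change predicted between the two keyframes.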
Step S300: calculating the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual.
Specifically, the landmark points corresponding to a keyframe image are 3D points in the real world. A visual residual exists between keyframe image $i$ and 3D point $X_j$, defined as $r_{ij} = u_{ij} - \Pi\!\left( T_{wc} X_j \right)$, where $u_{ij}$ is the projection of the 3D landmark point $X_j$ onto keyframe image $i$, its information matrix is $\Sigma_{ij}$, $\Pi(\cdot)$ is the camera projection function, and $T_{wc}$ is the transform from the world coordinate frame to the camera coordinate frame. Given $k+1$ keyframe images with state quantities $\mathcal{S}_k = \{\mathcal{S}_0, \dots, \mathcal{S}_k\}$, together with a set of $L$ 3D landmark points with state quantities $\mathcal{X} = \{x_0, \dots, x_{L-1}\}$, the camera-IMU time difference optimization problem can then be expressed as the combination of the IMU residuals and the visual residuals:

$$\min_{\mathcal{S}_k,\, \mathcal{X}} \left( \sum_{i=0}^{k-1} \left\lVert r_{I_{i,i+1}} \right\rVert^{2}_{\Sigma_{I_{i,i+1}}} + \sum_{j=0}^{L-1} \sum_{i \in K_j} \left\lVert r_{ij} \right\rVert^{2}_{\Sigma_{ij}} \right),$$

where $K_j$ is the set of keyframe images in which the $j$-th 3D landmark point is observed.
The timestamp correction method proposed in this embodiment treats the camera-IMU time difference as part of the state variables. If there is a time offset $dt$ between the IMU and the camera such that $t_{IMU} = t_{cam} + dt$, the timestamp of a camera image is larger than the true sampling timestamp of the corresponding data by $dt$. Suppose the $k$-th keyframe image $I_k$ carries timestamp $t_k$; its true sampling time is then $t_k - dt$. For a feature point on $I_k$ with image coordinates $u_{ij}$, its true position at time $t_k$ has already shifted. Assuming the feature point moves at constant velocity in the image plane over the short interval $dt$, its estimated coordinates at time $t_k$ are

$$\hat{u}_{ij} = u_{ij} + dt \cdot V_{ij},$$

where $V_{ij}$ is the velocity of the feature point in the image pixel plane, which can be estimated from the feature point's positions in the two neighboring keyframes:

$$V_{ij} = \frac{u_{i+1,j} - u_{i,j}}{t_{i+1} - t_i}.$$

Based on this assumption, the VIO system replaces the original feature coordinates with the time-offset-corrected coordinates $\hat{u}_{ij}$ parameterized by $dt$, thereby introducing the time offset parameter $dt$ into the optimization equation above. The optimization equation can then be written in a form that carries the time difference:

$$\min_{\mathcal{S}_k,\, \mathcal{X},\, dt} \left( \sum_{i=0}^{k-1} \left\lVert r_{I_{i,i+1}} \right\rVert^{2}_{\Sigma_{I_{i,i+1}}} + \sum_{j=0}^{L-1} \sum_{i \in K_j} \left\lVert u_{ij} + dt \cdot V_{ij} - \Pi\!\left( T_{wc} X_j \right) \right\rVert^{2}_{\Sigma_{ij}} \right).$$

The vertices and edges of the graph optimization are then constructed, where the vertices represent the unknown variables to be optimized and the edges are the residual formulas connecting the optimization variables, and the optimization is carried out to compute the time difference $dt$.
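The full graph optimization refines poses, landmarks, and $dt$ jointly. As a much smaller hypothetical illustration of why $dt$ is observable at all, suppose everything except $dt$ were held fixed and each feature obeyed the constant-pixel-velocity model above; the residual $r(dt) = u_{ij} + dt \cdot V_{ij} - \hat{u}_{ij}$ is then linear in $dt$, and $dt$ has a closed-form least-squares solution. All names and the synthetic data below are illustrative.

```python
def estimate_dt(observed, predicted, velocities):
    """Least-squares dt minimizing sum ||obs + dt*V - pred||^2.
    observed/predicted: lists of (u, v) pixels; velocities: (du/dt, dv/dt)."""
    num = den = 0.0
    for (uo, vo), (up, vp), (vu, vv) in zip(observed, predicted, velocities):
        # Normal equation of the 1-D linear least-squares problem:
        # dt = sum V . (pred - obs) / sum |V|^2
        num += vu * (up - uo) + vv * (vp - vo)
        den += vu * vu + vv * vv
    return num / den

# Synthetic data: true dt = 0.02 s, features drifting at known pixel velocities
true_dt = 0.02
vels = [(50.0, -20.0), (10.0, 30.0), (-40.0, 5.0)]
obs = [(100.0, 80.0), (200.0, 150.0), (320.0, 240.0)]
pred = [(u + true_dt * vu, v + true_dt * vv) for (u, v), (vu, vv) in zip(obs, vels)]
print(round(estimate_dt(obs, pred, vels), 6))  # -> 0.02
```

In the actual method the same information enters through the reprojection terms of the graph, where $dt$ is one vertex among the poses and landmarks rather than the only unknown.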
Step S400: correcting the timestamps of the keyframe images and of the motion data according to the time difference.
Specifically, after the camera-IMU time difference has been computed as described above, the timestamps of the keyframe images and of the motion data are corrected with the computed time difference, which removes the effect of the timestamp offset on the VIO system and improves the computational accuracy of the visual-inertial odometry.
In this embodiment, camera images are acquired, keyframe images are selected from them, and the motion data between adjacent keyframe images is extracted; pre-integrating the motion data yields the relative pose and the motion residual; the time difference between the keyframe images and the motion data is computed from the landmark points corresponding to the keyframe images and the motion residual; and finally the timestamps of the keyframe images and of the motion data are corrected with that time difference. This eliminates the timestamp offset between the camera and the IMU; since such an offset degrades the operation of the VIO system and reduces the computational accuracy of the visual-inertial odometry, this application improves that computational accuracy.
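The correction step itself is a plain shift of one stream onto the other's clock. A minimal sketch, assuming the convention $t_{IMU} = t_{cam} + dt$ from above (the function name and data are illustrative):

```python
# Once dt is estimated, align the two streams by shifting the camera
# timestamps onto the IMU clock; the IMU stream is left unchanged.

def correct_timestamps(cam_ts, imu_ts, dt):
    """Shift camera timestamps by dt so both streams share one clock."""
    return [t + dt for t in cam_ts], list(imu_ts)

cam = [0.000, 0.033, 0.066]
imu = [0.005, 0.010, 0.015]
cam_corr, imu_corr = correct_timestamps(cam, imu, dt=0.004)
print([round(t, 3) for t in cam_corr])  # -> [0.004, 0.037, 0.07]
```

Equivalently, the IMU stream could be shifted by $-dt$; what matters to the VIO back end is only that image and inertial measurements are interpolated on a common time base.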
In one embodiment, the timestamp correction method provided by the embodiments of this application may further include:
Step S101: acquiring camera images at a first preset acquisition frequency over a preset period, and selecting keyframe images from the camera images according to a preset rule;
Step S102: acquiring inertial measurement unit (IMU) data at a second preset acquisition frequency over the preset period;
Step S103: extracting the motion data between adjacent keyframe images from the IMU data.
Specifically, camera images are acquired at a given frame rate over a period of time. For example, with an image frame rate (the first preset acquisition frequency in this embodiment) of 30 Hz, camera images are collected for 3 minutes (the preset period in this embodiment), and IMU data is collected over the same period at the second preset acquisition frequency. By the nature of cameras and IMUs, the second preset acquisition frequency is higher than the first. A total of 5400 frames of camera images are collected within the 3 minutes, from which keyframe images are selected by taking one keyframe every 5 frames; the IMU data between the times of adjacent keyframe images, i.e. the motion data in this embodiment, is then extracted. The data collected by the IMU sensor consists mainly of accelerations and angular velocities along multiple axes, representing changes in the motion state of the SLAM system.
In this embodiment, camera images and IMU data are collected at given frequencies over a given period, the time difference between the keyframe images and the motion data is computed from the collected images and IMU data, and the timestamps of the keyframe images and motion data are finally corrected with that time difference, eliminating the camera-IMU timestamp offset and improving the computational accuracy of the visual-inertial odometry.
In one embodiment, the timestamp correction method provided by the embodiments of this application may further include:
Step S201: pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images, where the relative pose includes a position difference, a velocity difference, and a rotation difference;
Step S202: calculating the motion residual from the position difference, the velocity difference, and the rotation difference.
Specifically, for the SLAM system, pre-integrating the IMU data yields the relative pose between two consecutive keyframe images $i$ and $i+1$, comprising $\Delta p_{i,i+1}$, $\Delta v_{i,i+1}$, and $\Delta R_{i,i+1}$, where $\Delta p_{i,i+1}$ is the change in position of the SLAM system from keyframe image $i$ to keyframe image $i+1$, i.e. the position difference in this embodiment; $\Delta v_{i,i+1}$ is the change in velocity, i.e. the velocity difference; and $\Delta R_{i,i+1}$ is the change in rotation, i.e. the rotation difference. The motion residual is computed from the position difference, velocity difference, and rotation difference through the following formulas (written in the standard pre-integration form):

$$r_{\Delta R_{i,i+1}} = \operatorname{Log}\!\left( \Delta R_{i,i+1}^{\top}\, R_i^{\top} R_{i+1} \right),$$

where $R_i^{\top}$ is the transpose of the rotation matrix of the SLAM system at keyframe image $i$;

$$r_{\Delta v_{i,i+1}} = R_i^{\top}\!\left( v_{i+1} - v_i - g\,\Delta t_{i,i+1} \right) - \Delta v_{i,i+1},$$

where $g$ is the gravitational acceleration; and

$$r_{\Delta p_{i,i+1}} = R_i^{\top}\!\left( P_{i+1} - P_i - v_i\,\Delta t_{i,i+1} - \tfrac{1}{2}\, g\,\Delta t_{i,i+1}^{2} \right) - \Delta p_{i,i+1}.$$

In this embodiment, the motion residual is computed by pre-integration, and the timestamps of the keyframe images and motion data are corrected on the basis of the motion residual, eliminating the camera-IMU timestamp offset and improving the computational accuracy of the visual-inertial odometry.
In one embodiment, the timestamp correction method provided by the embodiments of this application may further include:
Step S310: calculating the reprojection error between the keyframe images and the landmark points from the landmark points corresponding to the keyframe images;
Step S320: calculating the time difference between the keyframe images and the motion data from the reprojection error and the motion residual.
Specifically, the reprojection error (the visual residual above) is defined as $r_{ij} = u_{ij} - \Pi\!\left( T_{wc} X_j \right)$, where $u_{ij}$ is the projection of landmark point $X_j$ onto keyframe image $i$, its information matrix is $\Sigma_{ij}$, and $T_{wc}$ is the transform from the world coordinate frame to the camera coordinate frame. Given $k+1$ keyframe images with state quantities $\mathcal{S}_k = \{\mathcal{S}_0, \dots, \mathcal{S}_k\}$, together with a set of $L$ 3D landmark points with state quantities $\mathcal{X} = \{x_0, \dots, x_{L-1}\}$, the camera-IMU optimization problem can be expressed as the combination of the IMU residuals and the reprojection errors:

$$\min_{\mathcal{S}_k,\, \mathcal{X}} \left( \sum_{i=0}^{k-1} \left\lVert r_{I_{i,i+1}} \right\rVert^{2}_{\Sigma_{I_{i,i+1}}} + \sum_{j=0}^{L-1} \sum_{i \in K_j} \left\lVert r_{ij} \right\rVert^{2}_{\Sigma_{ij}} \right).$$

The vertices and edges of the graph optimization are constructed, where the vertices represent the unknown variables to be optimized and the edges are the residual formulas connecting the optimization variables, and the optimization is carried out to compute the time difference $dt$.
In this embodiment, the time difference between the keyframe images and the motion data is computed from the landmark points corresponding to the keyframe images and the motion residual, eliminating the camera-IMU timestamp offset and improving the computational accuracy of the visual-inertial odometry.
Referring to FIG. 2, in one embodiment, the timestamp correction method provided by the embodiments of this application may further include:
Step S321: obtaining a first information matrix corresponding to the relative pose and a second information matrix corresponding to the reprojection error;
Step S322: determining an optimization equation from the reprojection error, the motion residual, the first information matrix, and the second information matrix;
Step S323: calculating the time difference between the keyframe images and the motion data from the optimization equation.
The timestamp correction method provided by this embodiment treats the camera-IMU time difference as part of the state variables. If there is a time offset $dt$ between the IMU and the camera such that $t_{IMU} = t_{cam} + dt$, the timestamp of a camera image is larger than the true sampling timestamp of the corresponding data by $dt$. Suppose the $k$-th keyframe image $I_k$ carries timestamp $t_k$; its true sampling time is then $t_k - dt$. For a feature point on $I_k$ with image coordinates $u_{ij}$, its true position at time $t_k$ has already shifted. Assuming the feature point moves at constant velocity in the image plane over the short interval $dt$, its estimated coordinates at time $t_k$ are

$$\hat{u}_{ij} = u_{ij} + dt \cdot V_{ij},$$

where $V_{ij}$ is the velocity of the feature point in the image pixel plane, which can be estimated from the feature point's positions in the two neighboring keyframes:

$$V_{ij} = \frac{u_{i+1,j} - u_{i,j}}{t_{i+1} - t_i}.$$

Based on this assumption, the VIO system replaces the original feature coordinates with the time-offset-corrected coordinates $\hat{u}_{ij}$ parameterized by $dt$, thereby introducing the time offset parameter $dt$ into the optimization equation. The optimization equation can then be written in a form that carries the time difference:

$$\min_{\mathcal{S}_k,\, \mathcal{X},\, dt} \left( \sum_{i=0}^{k-1} \left\lVert r_{I_{i,i+1}} \right\rVert^{2}_{\Sigma_{I_{i,i+1}}} + \sum_{j=0}^{L-1} \sum_{i \in K_j} \left\lVert u_{ij} + dt \cdot V_{ij} - \Pi\!\left( T_{wc} X_j \right) \right\rVert^{2}_{\Sigma_{ij}} \right),$$

where $\Sigma_{I_{i,i+1}}$ is the first information matrix and $\Sigma_{ij}$ is the second information matrix.
In this embodiment, the time difference between the keyframe images and the motion data is computed from the reprojection error, the motion residual, the first information matrix, and the second information matrix, eliminating the camera-IMU timestamp offset and improving the computational accuracy of the visual-inertial odometry.
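The constant-pixel-velocity estimate $V_{ij}$ used above can be sketched directly. A minimal illustration (function names and numbers are made up for the example): the velocity comes from the feature's positions in two consecutive keyframes, and the corrected coordinate is the observed one shifted by $dt \cdot V_{ij}$.

```python
# Estimate a feature's image-plane velocity from two keyframe observations,
# then apply the time-offset correction u_hat = u + dt * V.

def pixel_velocity(u_prev, u_next, t_prev, t_next):
    """(du/dt, dv/dt) from the feature's positions in two keyframes."""
    span = t_next - t_prev
    return ((u_next[0] - u_prev[0]) / span, (u_next[1] - u_prev[1]) / span)

def shift_feature(u, vel, dt):
    """Feature coordinate corrected by the time offset dt."""
    return (u[0] + dt * vel[0], u[1] + dt * vel[1])

v = pixel_velocity((100.0, 50.0), (106.0, 47.0), t_prev=0.0, t_next=0.25)
print(v)  # -> (24.0, -12.0) pixels per second
print(shift_feature((106.0, 47.0), v, dt=0.01))
```

The linearity of the shift in $dt$ is what lets the graph optimizer treat $dt$ as just another vertex with well-defined Jacobians.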
In one embodiment, the timestamp correction method provided by the embodiments of this application may further include:
Step S3231: substituting the vertex coordinates and edge feature points of the keyframe images into the optimization equation to calculate the time difference between the keyframe images and the motion data.
The vertices and edges of the graph optimization are constructed: the vertices of the keyframe images serve as unknown variables to be optimized alongside the time difference $dt$, and the edges serve as the residual formulas connecting the optimization variables. These are substituted into the optimization equation, and the optimization is carried out to compute the time difference $dt$.
In this embodiment, the camera-IMU time difference is computed, eliminating the timestamp offset between the camera and the IMU and improving the computational accuracy of the visual-inertial odometry.
The timestamp correction apparatus provided by this application is described below; the timestamp correction apparatus described below and the timestamp correction method described above may be cross-referenced.
Referring to FIG. 3, this application further provides a timestamp correction apparatus, including:
a motion data extraction module 10 configured to acquire camera images, select keyframe images from the camera images, and extract motion data between adjacent keyframe images;
a pre-integration module 20 configured to pre-integrate the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose;
a time difference calculation module 30 configured to calculate the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual;
a timestamp correction module 40 configured to correct the timestamps of the keyframe images and of the motion data according to the time difference.
It can be understood that the motion data extraction module includes:
an image acquisition unit configured to acquire camera images at a first preset acquisition frequency over a preset period and select keyframe images from the camera images according to a preset rule;
an IMU data acquisition unit configured to acquire inertial measurement unit (IMU) data at a second preset acquisition frequency over the preset period;
a motion data extraction unit configured to extract the motion data between adjacent keyframe images from the IMU data.
It can be understood that the pre-integration module includes:
a pre-integration unit configured to pre-integrate the motion data to obtain the relative pose between the adjacent keyframe images, where the relative pose includes a position difference, a velocity difference, and a rotation difference;
a motion residual calculation unit configured to calculate the motion residual from the position difference, the velocity difference, and the rotation difference.
It can be understood that the time difference calculation module includes:
a reprojection error calculation unit configured to calculate the reprojection error between the keyframe images and the landmark points from the landmark points corresponding to the keyframe images;
a first time difference calculation unit configured to calculate the time difference between the keyframe images and the motion data from the reprojection error and the motion residual.
It can be understood that the first time difference calculation unit includes:
an information matrix acquisition unit configured to obtain a first information matrix corresponding to the relative pose and a second information matrix corresponding to the reprojection error;
an optimization equation determination unit configured to determine an optimization equation from the reprojection error, the motion residual, the first information matrix, and the second information matrix;
a second time difference calculation unit configured to calculate the time difference between the keyframe images and the motion data from the optimization equation.
It can be understood that the second time difference calculation unit includes:
a third time difference calculation unit configured to substitute the vertex coordinates and edge feature points of the keyframe images into the optimization equation to calculate the time difference between the keyframe images and the motion data.
FIG. 4 illustrates the physical structure of an electronic device. As shown in FIG. 4, the electronic device may include a processor 410, a communications interface 420, a memory 430, and a communication bus 440, where the processor 410, the communications interface 420, and the memory 430 communicate with one another via the communication bus 440. The processor 410 may invoke logic instructions in the memory 430 to execute the timestamp correction method, the method including: acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images; pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose; calculating the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual; and correcting the timestamps of the keyframe images and of the motion data according to the time difference.
In addition, the logic instructions in the memory 430 may be implemented as software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
In another aspect, this application further provides a computer program product, the computer program product including a computer program that may be stored on a non-transitory computer-readable storage medium. When the computer program is executed by a processor, the computer can execute the timestamp correction method provided by the methods above, the method including: acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images; pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose; calculating the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual; and correcting the timestamps of the keyframe images and of the motion data according to the time difference.
In yet another aspect, this application further provides a non-transitory computer-readable storage medium storing a computer program that, when executed by a processor, implements the timestamp correction method provided by the methods above, the method including: acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images; pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose; calculating the time difference between the keyframe images and the motion data from the landmark points corresponding to the keyframe images and the motion residual; and correcting the timestamps of the keyframe images and of the motion data according to the time difference.
The apparatus embodiments described above are merely illustrative. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which a person of ordinary skill in the art can understand and implement without creative effort.
From the description of the implementations above, a person skilled in the art can clearly understand that the implementations may be realized by software plus a necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the technical solution above, in essence or in the part contributing to the prior art, may be embodied as a software product. The computer software product may be stored in a computer-readable storage medium such as a ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments or in parts of the embodiments.
Finally, it should be noted that the embodiments above are intended only to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application.

Claims (10)

  1. A timestamp correction method, comprising:
    acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images;
    pre-integrating the motion data to obtain a relative pose between the adjacent keyframe images and a motion residual corresponding to the relative pose;
    calculating a time difference between the keyframe images and the motion data according to landmark points corresponding to the keyframe images and the motion residual;
    correcting timestamps of the keyframe images and timestamps of the motion data according to the time difference.
  2. The timestamp correction method according to claim 1, wherein the step of acquiring camera images, selecting keyframe images from the camera images, and extracting motion data between adjacent keyframe images comprises:
    acquiring the camera images at a first preset acquisition frequency over a preset period, and selecting the keyframe images from the camera images according to a preset rule;
    acquiring inertial measurement unit (IMU) data at a second preset acquisition frequency over the preset period;
    extracting the motion data between adjacent keyframe images from the IMU data.
  3. The timestamp correction method according to claim 1, wherein the step of pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images and the motion residual corresponding to the relative pose comprises:
    pre-integrating the motion data to obtain the relative pose between the adjacent keyframe images, wherein the relative pose comprises a position difference, a velocity difference, and a rotation difference;
    calculating the motion residual according to the position difference, the velocity difference, and the rotation difference.
  4. The timestamp correction method according to claim 1, wherein the step of calculating the time difference between the keyframe images and the motion data according to the landmark points corresponding to the keyframe images and the motion residual comprises:
    calculating a reprojection error between the keyframe images and the landmark points according to the landmark points corresponding to the keyframe images;
    calculating the time difference between the keyframe images and the motion data according to the reprojection error and the motion residual.
  5. The timestamp correction method according to claim 4, wherein the step of calculating the time difference between the keyframe images and the motion data according to the reprojection error and the motion residual comprises:
    obtaining a first information matrix corresponding to the relative pose and a second information matrix corresponding to the reprojection error;
    determining an optimization equation according to the reprojection error, the motion residual, the first information matrix, and the second information matrix;
    calculating the time difference between the keyframe images and the motion data according to the optimization equation.
  6. The timestamp correction method according to claim 5, wherein the step of calculating the time difference between the keyframe images and the motion data according to the optimization equation comprises:
    substituting vertex coordinates and edge feature points of the keyframe images into the optimization equation to calculate the time difference between the keyframe images and the motion data.
  7. A timestamp correction apparatus, comprising:
    a motion data extraction module configured to acquire camera images, select keyframe images from the camera images, and extract motion data between adjacent keyframe images;
    a pre-integration module configured to pre-integrate the motion data to obtain a relative pose between the adjacent keyframe images and a motion residual corresponding to the relative pose;
    a time difference calculation module configured to calculate a time difference between the keyframe images and the motion data according to landmark points corresponding to the keyframe images and the motion residual;
    a timestamp correction module configured to correct timestamps of the keyframe images and timestamps of the motion data according to the time difference.
  8. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the timestamp correction method according to any one of claims 1 to 6.
  9. A non-transitory computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the timestamp correction method according to any one of claims 1 to 6.
  10. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the timestamp correction method according to any one of claims 1 to 6.
PCT/CN2022/103380 2022-05-24 2022-07-01 Timestamp correction method, apparatus, device, medium, and computer program product WO2023226156A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210576177.0A 2022-05-24 2022-05-24 Timestamp correction method, apparatus, device, medium, and computer program product
CN202210576177.0 2022-05-24

Publications (1)

Publication Number Publication Date
WO2023226156A1 true WO2023226156A1 (zh) 2023-11-30

Family

ID=83667803

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/103380 WO2023226156A1 (zh) 2022-05-24 2022-07-01 时间戳校正方法、装置、设备、介质及计算机程序产品

Country Status (2)

Country Link
CN (1) CN115239758A (zh)
WO (1) WO2023226156A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116304992A * 2023-05-22 2023-06-23 智道网联科技(北京)有限公司 Sensor time difference determination method, apparatus, computer device, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130335554A1 * 2012-06-14 2013-12-19 Qualcomm Incorporated Adaptive estimation of frame time stamp latency
CN107869989A * 2017-11-06 2018-04-03 东北大学 Positioning method and system based on fusion of visual and inertial navigation information
CN108629793A * 2018-03-22 2018-10-09 中国科学院自动化研究所 Visual-inertial odometry method and device using online temporal calibration
CN110246147A * 2019-05-14 2019-09-17 中国科学院深圳先进技术研究院 Visual-inertial odometry method, visual-inertial odometry apparatus, and mobile device
CN112907633A * 2021-03-17 2021-06-04 中国科学院空天信息创新研究院 Dynamic feature point identification method and application thereof
CN114216455A * 2021-11-04 2022-03-22 天津工业大学 Visual-inertial odometry system simultaneously optimizing time offset


Also Published As

Publication number Publication date
CN115239758A (zh) 2022-10-25

Similar Documents

Publication Publication Date Title
CN109307508B (zh) Panoramic inertial-navigation SLAM method based on multiple keyframes
CN110084832B (zh) Camera pose correction method, apparatus, system, device, and storage medium
CN110009681B (zh) IMU-assisted monocular visual odometry pose processing method
CN112304307A (zh) Positioning method and apparatus based on multi-sensor fusion, and storage medium
CN109506642B (zh) Robot multi-camera visual-inertial real-time localization method and apparatus
US8698875B2 (en) Estimation of panoramic camera orientation relative to a vehicle coordinate frame
CN114018274B (zh) Vehicle positioning method and apparatus, and electronic device
CN113406682B (zh) Positioning method and apparatus, electronic device, and storage medium
WO2020140431A1 (zh) Camera pose determination method and apparatus, electronic device, and storage medium
CN110260861B (zh) Pose determination method and apparatus, and odometer
US20220051031A1 (en) Moving object tracking method and apparatus
CN113551665B (zh) High-dynamic motion state sensing system and sensing method for a moving carrier
CN110388919B (zh) Three-dimensional model positioning method in augmented reality based on feature maps and inertial measurement
CN110660098A (zh) Monocular-vision-based positioning method and apparatus
CN111609868A (zh) Visual-inertial odometry method based on an improved optical flow method
WO2023226156A1 (zh) Timestamp correction method, apparatus, device, medium, and computer program product
JP2023021994A (ja) Data processing method and apparatus for autonomous driving vehicle, electronic device, storage medium, computer program, and autonomous driving vehicle
CN111595332B (zh) Full-environment positioning method fusing inertial technology and visual modeling
CN113066127A (zh) Visual-inertial odometry method and system with online calibration of device parameters
CN113610702B (zh) Mapping method and apparatus, electronic device, and storage medium
CN113744308A (zh) Pose optimization method and apparatus, electronic device, medium, and program product
CN114440877B (zh) Asynchronous multi-camera visual-inertial odometry positioning method
CN115727871A (zh) Trajectory quality detection method and apparatus, electronic device, and storage medium
WO2020019116A1 (zh) Multi-source data mapping method, related apparatus, and computer-readable storage medium
CN115900697A (zh) Object motion trajectory information processing method, electronic device, and autonomous driving vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22943347

Country of ref document: EP

Kind code of ref document: A1