Online Temporal Calibration Based on Cross-Correlation and Rotation Constraints for Visual-Inertial Odometry

Introduction

Visual-inertial odometry (VIO) has become a critical technology in robotics and autonomous systems, enabling devices to estimate their motion by fusing data from cameras and inertial measurement units (IMUs). While cameras provide rich visual information for feature tracking and scene understanding, IMUs offer high-frequency measurements of acceleration and angular velocity, making them complementary sensors. However, a significant challenge in VIO systems is the temporal misalignment between camera and IMU data streams due to varying transmission delays, sensor processing times, and operating system overhead. This time offset can degrade the accuracy of motion estimation and, in extreme cases, cause system failure.

Existing approaches for temporal calibration often require hardware synchronization or rely on specific optimization frameworks, limiting their applicability. Some methods augment the state vector in extended Kalman filter (EKF)-based systems to estimate time offsets, while others use offline techniques that cannot adapt to real-time requirements. Additionally, many algorithms depend on initial estimates, which are often unknown in practice. To address these limitations, this paper presents an online temporal calibration algorithm that leverages cross-correlation and rotation constraints to estimate and correct time offsets between camera and IMU data streams without requiring prior knowledge or hardware modifications.

Problem Formulation and Algorithm Overview

The core challenge lies in aligning the timestamps of camera and IMU measurements, which reach the processing system with different delays. If, for example, the IMU data arrives earlier than the corresponding camera frame because of differing latencies, the resulting time offset introduces errors into motion estimation, since the fused measurements are no longer temporally synchronized. The proposed algorithm estimates this offset by comparing the relative motion derived from each sensor and correcting the IMU timestamps accordingly.

The algorithm consists of two main stages:

  1. Cross-Correlation-Based Initial Estimation: The angular velocities computed from camera motion and IMU measurements are compared using cross-correlation to obtain a coarse time offset estimate.
  2. Rotation-Constrained Refinement: The initial estimate is refined by enforcing consistency between the relative rotations derived from camera poses and IMU preintegration, optimizing an error function to minimize misalignment.

This two-step approach ensures robustness and accuracy, allowing the system to operate reliably even with significant time offsets.

Relative Pose and Angular Velocity Computation

IMU Preintegration for Relative Motion

IMU measurements provide high-frequency acceleration and angular velocity data, which can be integrated to estimate the sensor’s motion over short intervals. The IMU model accounts for biases and noise, and preintegration techniques are used to compute relative rotations between consecutive IMU readings. By integrating angular velocity measurements, the algorithm estimates the IMU’s orientation changes, which are later compared with camera-derived rotations.
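As a concrete illustration, the gyroscope part of this preintegration can be sketched in a few lines of NumPy. This is a minimal sketch under simplifying assumptions (a known constant bias, left-endpoint integration, no noise terms), not the paper's implementation; the names `so3_exp` and `preintegrate_rotation` are illustrative.

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: map a rotation vector (axis-angle) to a rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    a = phi / theta
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate_rotation(gyro, dt, bias=None):
    """Chain bias-corrected angular-velocity increments into one relative rotation.

    gyro: (N, 3) angular-velocity samples [rad/s]; dt: sample period [s].
    """
    if bias is None:
        bias = np.zeros(3)
    R = np.eye(3)
    for w in gyro:
        R = R @ so3_exp((w - bias) * dt)  # right-multiply: body-frame increment
    return R
```

The resulting relative rotation between two camera times is what gets compared against the camera-derived rotation in the refinement stage.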

Camera Motion Estimation

Using feature correspondences between consecutive camera frames, the algorithm computes the relative pose between images via epipolar geometry: the essential matrix is estimated and decomposed into rotation and translation components. Because monocular translation is only recovered up to scale, only the rotation is used for temporal calibration, sidestepping the scale ambiguity. The relative rotation between frames is converted into an axis-angle representation, from which the camera's angular velocity is derived under the assumption of a constant rotation rate between frames.
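The axis-angle step described above can be sketched as follows; the helper name `rotation_to_angular_velocity` is illustrative, and the formula assumes the rotation angle stays away from pi (always the case for inter-frame rotations at normal frame rates).

```python
import numpy as np

def rotation_to_angular_velocity(R_rel, dt):
    """Mean angular velocity implied by a relative rotation acting over dt seconds.

    Recovers the axis-angle (log map) of R_rel and divides the angle by dt,
    assuming a constant rotation rate between the two frames.
    """
    # Rotation angle from the trace, clipped for numerical safety.
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros(3)
    # Rotation axis from the skew-symmetric part (valid away from theta = pi).
    axis = np.array([R_rel[2, 1] - R_rel[1, 2],
                     R_rel[0, 2] - R_rel[2, 0],
                     R_rel[1, 0] - R_rel[0, 1]]) / (2.0 * np.sin(theta))
    return axis * (theta / dt)
```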

Time Offset Estimation

Cross-Correlation for Coarse Alignment

The angular velocities from the camera and IMU are interpolated to a common high frequency to improve alignment precision. Cross-correlation is then applied to the interpolated sequences to identify the time shift that maximizes their similarity. This method provides an initial estimate of the time offset but is limited by the interpolation resolution.

Rotation-Constrained Optimization

To refine the initial estimate, the algorithm enforces consistency between the relative rotations from the camera and IMU. The key insight is that if the time offset were correctly estimated, the rotations derived from both sensors should align. An error function is formulated to measure the discrepancy between these rotations, and gradient-based optimization techniques are used to minimize this error. The optimization adjusts the time offset to ensure that the IMU’s integrated rotation matches the camera’s estimated rotation, leading to a more precise calibration.
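The refinement stage can be sketched as a one-dimensional search over candidate offsets. For simplicity this sketch substitutes a grid search for the paper's gradient-based optimization (the error landscape and its minimizer are the same); all function names are illustrative, and the error metric is the geodesic angle between the camera's relative rotation and the gyro-integrated rotation over each inter-frame interval.

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    a = phi / theta
    K = np.array([[0.0, -a[2], a[1]],
                  [a[2], 0.0, -a[0]],
                  [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def rotation_angle(R):
    """Geodesic angle of a rotation matrix, used as the error metric."""
    return np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))

def integrated_rotation(t_samples, gyro, t0, t1):
    """Chain the gyro increments whose timestamps fall inside [t0, t1)."""
    R = np.eye(3)
    dts = np.diff(t_samples)
    for i in range(len(dts)):
        if t0 <= t_samples[i] < t1:
            R = R @ so3_exp(gyro[i] * dts[i])
    return R

def offset_error(td, t_cam, R_cam_rel, t_imu, gyro):
    """Sum of squared rotation discrepancies for a candidate offset td."""
    err = 0.0
    for k in range(len(t_cam) - 1):
        R_imu = integrated_rotation(t_imu + td, gyro, t_cam[k], t_cam[k + 1])
        err += rotation_angle(R_cam_rel[k].T @ R_imu) ** 2
    return err

def refine_offset(td0, t_cam, R_cam_rel, t_imu, gyro,
                  half_window=0.01, step=0.0005):
    """Search a window around the coarse estimate td0 for the offset that
    best aligns camera and IMU relative rotations."""
    candidates = np.arange(td0 - half_window, td0 + half_window + step, step)
    errors = [offset_error(td, t_cam, R_cam_rel, t_imu, gyro) for td in candidates]
    return candidates[int(np.argmin(errors))]
```

At the true offset the shifted IMU timestamps place each gyro increment in the correct inter-frame window, so the integrated rotations match the camera's and the error reaches its minimum.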

Experimental Validation

The algorithm was evaluated on the Blackbird dataset, which contains aggressive motion sequences captured by a quadrotor. Artificial time offsets were introduced to the IMU data to simulate synchronization errors, and the system’s performance was assessed by comparing the estimated trajectories against ground truth.

Results and Analysis

  1. Robustness to Large Time Offsets: Without calibration, the VIO system failed when the time offset exceeded ±10 ms. With the proposed algorithm, the system remained stable even with offsets up to ±50 ms.
  2. Trajectory Accuracy: The calibrated system produced trajectories that closely matched those obtained with perfectly synchronized data, demonstrating significant improvements over uncalibrated operation.
  3. Comparison with Existing Methods: The two-stage approach outperformed methods relying solely on cross-correlation or requiring initial offset guesses. Unlike prior work, the proposed algorithm does not depend on external initialization, making it more versatile.

Visualization of Angular Velocity Alignment

A key strength of the method is its ability to align the angular velocity profiles of the camera and IMU. Experimental results showed that after calibration, the IMU’s angular velocity curves closely followed those derived from the camera, confirming effective temporal synchronization.

Conclusion

This paper introduced an online temporal calibration algorithm for VIO systems that addresses the challenge of time misalignment between camera and IMU data streams. By combining cross-correlation for initial estimation and rotation constraints for refinement, the method achieves accurate synchronization without requiring prior knowledge or hardware modifications. Experimental results demonstrated its effectiveness in maintaining system robustness and trajectory accuracy under significant time offsets.

The algorithm’s independence from specific SLAM frameworks makes it widely applicable, enabling more reliable operation in real-world scenarios where sensor synchronization cannot be guaranteed. Future work may explore adaptive techniques to handle dynamic changes in time offsets and extend the method to multi-sensor systems.

doi.org/10.19734/j.issn.1001-3695.2024.03.0114
