Calibrating your sensors#

Overview#

Autoware expects multiple sensors to be attached to the vehicle as input to the perception, localization, and planning stacks. These sensors must be calibrated correctly, and their positions must be defined using either URDF files (as in sample_sensor_kit) or TF launch files.
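
As a minimal sketch of the second option (a TF launch file), the ROS 2 Python launch file below publishes one static transform from base_link to a lidar frame. The frame names and numeric offsets are placeholders rather than real calibration values, and the flag-style arguments assume a recent tf2_ros (ROS 2 Humble or later).

```python
# Hypothetical sketch of a "tf launch file" for one sensor: publish the
# calibrated base_link -> lidar_top transform as a static TF.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            name='base_link_to_lidar_top_tf',
            # Placeholder translation (m) and rotation (rad), not real values.
            arguments=[
                '--x', '1.20', '--y', '0.00', '--z', '1.80',
                '--roll', '0.00', '--pitch', '0.00', '--yaw', '0.00',
                '--frame-id', 'base_link', '--child-frame-id', 'lidar_top',
            ],
        ),
    ])
```

In a real sensor kit these values would come from the calibration results described below, with one such transform (or URDF joint) per sensor.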

Camera calibration#

Intrinsic calibration#

Lidar-lidar calibration#

Lidar-lidar calibration tool from AutoCore#

LL-Calib, provided by AutoCore on GitHub, is a lightweight toolkit for online/offline 3D LiDAR-to-LiDAR calibration. It is based on local mapping and the GICP (Generalized Iterative Closest Point) method to derive the transformation between the main and sub lidar. Information on how to use the tool, troubleshooting tips, and example rosbags can be found in the LL-Calib repository.
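
As a rough illustration of the underlying idea (not LL-Calib's own code), the sketch below registers the sub lidar's scan against the main lidar's scan with Open3D, using point-to-plane ICP as a simpler stand-in for GICP, to recover the sub-to-main extrinsic. The file paths, voxel size, and initial guess are placeholders.

```python
import numpy as np
import open3d as o3d

# Placeholder inputs; in practice these would be deskewed scans or small
# local maps from the main and sub lidars.
main_cloud = o3d.io.read_point_cloud("main_lidar.pcd")
sub_cloud = o3d.io.read_point_cloud("sub_lidar.pcd")

# Downsample and estimate normals for the point-to-plane objective.
main_ds = main_cloud.voxel_down_sample(voxel_size=0.2)
sub_ds = sub_cloud.voxel_down_sample(voxel_size=0.2)
for cloud in (main_ds, sub_ds):
    cloud.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=1.0, max_nn=30))

init_guess = np.eye(4)  # rough mounting transform if known, else identity
result = o3d.pipelines.registration.registration_icp(
    sub_ds, main_ds, 1.0, init_guess,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

# 4x4 homogeneous transform taking points from the sub lidar frame
# into the main lidar frame.
print(result.transformation)
```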

Lidar-camera calibration#

Developed by MathWorks, the Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.

https://ww2.mathworks.cn/help/lidar/ug/get-started-lidar-camera-calibrator.html

The SensorsCalibration toolbox v0.1 is another open-source option for lidar-camera calibration. It is a project for LiDAR-to-camera calibration that includes both automatic and manual calibration.

https://github.com/PJLab-ADG/SensorsCalibration/blob/master/lidar2camera/README.md
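
Both tools estimate the same quantity: a rigid transform that, together with the camera intrinsics, lets lidar points be projected into the image. The sketch below (an assumed illustration, not taken from either tool) shows that projection step, which is also a common way to visually check a lidar-camera calibration; all numeric values are placeholders.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 lidar points into pixel coordinates.

    points_lidar: (N, 3) points in the lidar frame
    K:            (3, 3) camera intrinsic matrix
    R, t:         rotation (3, 3) and translation (3,) taking lidar -> camera
    """
    pts_cam = points_lidar @ R.T + t      # transform into the camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]  # keep points in front of the camera
    uvw = pts_cam @ K.T                   # apply the pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]       # normalize by depth -> pixel coords

# Placeholder intrinsics and extrinsic purely for illustration.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, -0.1, -0.2])
pixels = project_lidar_to_image(np.random.rand(100, 3) * 10.0, K, R, t)
```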

Lidar-IMU calibration#

Developed by the APRIL Lab at Zhejiang University in China, LI-Calib is a toolkit for calibrating the 6DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization. The IMU-based cost and the LiDAR point-to-surfel (surfel = surface element) distance are minimized jointly, which renders the calibration problem well-constrained in general scenarios.
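
As a rough sketch of that point-to-surfel cost (assumed notation, not LI-Calib's implementation), the residual for one lidar point is its distance to the associated surfel plane after the point has been mapped through the lidar-IMU extrinsic and the trajectory pose:

```python
import numpy as np

def point_to_surfel_residual(p_lidar, R_WI, t_WI, R_IL, t_IL, n, d):
    """Signed distance of a lidar point to a surfel plane n.x + d = 0.

    p_lidar:    3D point in the lidar frame
    R_WI, t_WI: IMU pose in the world frame (from the continuous-time trajectory)
    R_IL, t_IL: lidar-to-IMU extrinsic being calibrated
    n, d:       surfel plane parameters (unit normal and offset) in the world frame
    """
    p_imu = R_IL @ p_lidar + t_IL    # lidar frame -> IMU frame
    p_world = R_WI @ p_imu + t_WI    # IMU frame -> world frame
    return float(n @ p_world + d)    # point-to-plane distance
```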

AutoCore has forked the original LI-Calib tool and reworked the LiDAR input for more general usage. Information on how to use the tool, troubleshooting tips, and example rosbags can be found in the LI-Calib fork on GitHub.