
# Calibrating your sensors

## Overview

Autoware expects multiple sensors to be attached to the vehicle as inputs to the perception, localization, and planning stacks. Autoware uses fusion techniques to combine information from these sensors. For this to work effectively, all sensors must be calibrated properly so that their coordinate systems align, and their positions must be defined using either URDF files (as in sample_sensor_kit) or TF launch files (see the sketch below). In this documentation, we explain the calibration process using TIER IV's CalibrationTools repository. Please see the Starting with TIER IV's CalibrationTools page for installation and usage of this tool.
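
The sketch below shows, only as an illustration, how a calibrated extrinsic can be published as a static TF from a ROS 2 launch file. The frame names (`base_link`, `lidar_top`) and the numeric transform values are placeholders chosen for this example; in a real sensor kit they come from the calibrated URDF/extrinsics files rather than being hard-coded.

```python
# Minimal sketch: publish a calibrated LiDAR extrinsic as a static transform.
# Frame names and numeric values are placeholders, not Autoware defaults.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            name='lidar_top_tf_publisher',
            # x y z [m], yaw pitch roll [rad], parent frame, child frame
            arguments=['0.9', '0.0', '2.0',
                       '0.0', '0.0', '0.0',
                       'base_link', 'lidar_top'],
        ),
    ])
```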

If you want to look at other calibration packages and methods, you can check out the following packages.

## Other packages you can check out

### Camera calibration

#### Intrinsic Calibration

#### Camera calibration tools provided by TIER IV

At TIER IV, we provide two types of calibration tools for camera calibration:

### Lidar-lidar calibration

#### Lidar-lidar calibration tool provided by AutoCore

LL-Calib on GitHub, provided by AutoCore, is a lightweight toolkit for online/offline 3D LiDAR-to-LiDAR calibration. It is based on local mapping and the GICP method to derive the relative transformation between the main and sub LiDAR. Information on how to use the tool, troubleshooting tips, and example rosbags can be found at the above link.
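
To give a rough, hedged picture of the registration step such tools are built on, the sketch below aligns a sub-LiDAR scan to the main-LiDAR scan using the GICP implementation from Open3D. This is not LL-Calib's own code; the file names, voxel size, correspondence distance, and the identity initial guess are assumptions made just for the example.

```python
# Minimal sketch: estimate the sub-LiDAR -> main-LiDAR transform with GICP (Open3D).
# File names, voxel size, and the identity initial guess are placeholder assumptions.
import numpy as np
import open3d as o3d

main_cloud = o3d.io.read_point_cloud("main_lidar.pcd")  # target (main LiDAR)
sub_cloud = o3d.io.read_point_cloud("sub_lidar.pcd")    # source (sub LiDAR)

# Downsample to speed up registration and reduce noise.
main_down = main_cloud.voxel_down_sample(voxel_size=0.2)
sub_down = sub_cloud.voxel_down_sample(voxel_size=0.2)

# Generalized ICP: source, target, max correspondence distance, initial guess.
result = o3d.pipelines.registration.registration_generalized_icp(
    sub_down, main_down, 1.0, np.eye(4),
    o3d.pipelines.registration.TransformationEstimationForGeneralizedICP(),
    o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=50))

print("Estimated sub -> main extrinsic:\n", result.transformation)
```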

#### Lidar-lidar calibration tool provided by TIER IV

TIER IV provides the following two types of calibration tools for performing LiDAR–LiDAR calibration:

### Lidar-camera calibration

Developed by MathWorks, the Lidar Camera Calibrator app enables you to interactively estimate the rigid transformation between a lidar sensor and a camera.

https://ww2.mathworks.cn/help/lidar/ug/get-started-lidar-camera-calibrator.html

SensorsCalibration toolbox v0.1: another open-source method for LiDAR-camera calibration. This project covers LiDAR-to-camera calibration, including both automatic and manual calibration.

https://github.com/PJLab-ADG/SensorsCalibration/blob/master/lidar2camera/README.md

Developed by AutoCore, this easy-to-use, lightweight toolkit performs LiDAR-camera calibration; a fully automatic calibration can be completed in only three steps.

https://github.com/autocore-ai/calibration_tools/tree/main/lidar-cam-calib-related

#### Lidar-camera calibration with tools provided by TIER IV

TIER IV provides the following calibration tools for performing LiDAR–Camera calibration:

### Lidar-IMU calibration

Developed by the APRIL Lab at Zhejiang University in China, the LI-Calib toolkit calibrates the 6-DoF rigid transformation and the time offset between a 3D LiDAR and an IMU, based on continuous-time batch optimization. IMU-based costs and LiDAR point-to-surfel (surfel = surface element) distances are minimized jointly, which renders the calibration problem well-constrained in general scenarios.
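
Schematically, and only as a paraphrase of the description above (the exact residual definitions are given in the LI-Calib paper and code), the calibration can be viewed as one joint least-squares problem over the continuous-time trajectory $\mathbf{x}(t)$, the LiDAR-IMU extrinsic $\mathbf{T}_{LI}$, and the time offset $t_d$:

$$
\min_{\mathbf{T}_{LI},\, t_d,\, \mathbf{x}(t)}\;
\sum_{k} \left\lVert \mathbf{r}^{\mathrm{IMU}}_{k}\!\left(\mathbf{x}(t), t_d\right) \right\rVert^{2}_{\boldsymbol{\Sigma}_{I}}
\;+\;
\sum_{j} \left\lVert \mathbf{r}^{\mathrm{surfel}}_{j}\!\left(\mathbf{x}(t), \mathbf{T}_{LI}\right) \right\rVert^{2}_{\boldsymbol{\Sigma}_{L}}
$$

where $\mathbf{r}^{\mathrm{IMU}}_{k}$ are the IMU measurement residuals and $\mathbf{r}^{\mathrm{surfel}}_{j}$ are the LiDAR point-to-surfel distances.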

AutoCore has forked the original LI-Calib tool and rewritten the LiDAR input interface for more general usage. Information on how to use the tool, troubleshooting tips, and example rosbags can be found at the LI-Calib fork on GitHub.

### Base-lidar calibration

#### Base-lidar calibration with tools provided by TIER IV

TIER IV provides the following two types of calibration tools for performing Base–LiDAR calibration:

## Other calibration tools provided by TIER IV

In addition to sensor calibration, TIER IV also develops calibration tools for localization and control, which are released as open-source software. Please refer to the other calibration tools here.