The Autoware Centers of Excellence are happy to announce an upcoming tutorial hosted at the IEEE Intelligent Vehicles Symposium in Anchorage, Alaska, USA, June 4-7, 2023. This tutorial aims to introduce the open-source autonomous driving software Autoware.Universe. Speakers are invited from universities worldwide to give talks on various autonomous-vehicle topics and the application of Autoware in these fields. The tutorial will serve as a general introduction to Autoware and help the audience quickly become familiar with the installation process and get started. It will be split into 30- and 45-minute sessions, each covering a topic of interest. A detailed description of each session is provided below.


Professors, students, independent researchers, and industrial partners are welcome to attend. To get the most out of the tutorial, attendees should have a background in engineering, computer science, mathematics, robotics, or a related field. An understanding of autonomous vehicles, on either the hardware or the software side, is helpful but not required.


June 4th, 2023   8:00am - 11:00am Alaska Time

8:00am - 8:20am     Introduction: Autoware Software and Development
8:20am - 9:00am     Session I: Autoware on Scaled Platforms
9:00am - 9:40am     Session II: Autoware on Custom Delivery Vehicles
9:40am - 10:20am   Session III: Autoware Development Environment
10:20am - 11:00am Session IV: Pointcloud and HD Map Generation for Autoware



Ryohsuke Mitsudome
Tier IV Inc

To be updated

Session I: Autoware on Scaled Platforms

Rahul Mangharam, Zhijie Qiao
University of Pennsylvania

This tutorial is about running the Autoware software stack on the 1/10-scale F1Tenth vehicle. It will introduce the integration of Autoware on F1Tenth and demonstrate features such as mapping, localization, planning, and control. The demonstration will cover both the simulation environment and the real car, and is easy for the audience to follow and reproduce.

Session II: Autoware on Custom Delivery Vehicles

Krzysztof Walas, Amadeusz Szymko, Michal Nowicki
Poznan University of Technology

At PUT we are using Autoware on platforms of different scales, starting from F1/10th cars, through small delivery vehicles, to full-scale electric golf carts. In our tutorial we will show the hardware setup for these vehicles and how Autoware was integrated on the different cars. The main goal of the tutorial is to show how you can develop different autonomous platforms using Autoware and take advantage of the open-source software stack.

Session III: Autoware Development Environment

Phillip Karle
Technical University of Munich

TUM is currently building up a new, comprehensive research environment for autonomous driving software. The environment consists of a level-5-capable vehicle equipped with all common environment sensors and two high-performance computers. In addition, a Hardware-in-the-Loop simulation for 3D perception and full-stack scenario simulation is being built. The simulation software combines CARLA for perception simulation with CommonRoad to spawn complex, interactive scenarios, creating a comprehensive simulation environment. The idea for the tutorial is to present TUM's development environment, i.e., the integration of the Autoware software stack into the simulation environment with CommonRoad and CARLA. Additionally, the overall workflow from feature development up to real-world tests will be shown.

Session IV: Pointcloud and HD Map Generation for Autoware

Alexander Carballo
Nagoya University

Autoware relies on the availability of maps for localization and planning. While complementary sensors simplify initialization and improve accuracy, the main sensor used for localization is the 3D LiDAR. Consequently, 3D pointcloud maps of the target environment are an indispensable input. In addition, to navigate toward the destination while following the rules of the target environment, such as speed limits, traffic lights, and stop signs, Autoware also relies on the existence of high-definition (HD) maps. HD maps go beyond the simple semantics of perception: they capture the individual lanes, their direction, speed limits, association with traffic lights, interconnectivity at junctions, and surface friction; the status of traffic lights and other regulatory elements; information about traffic accidents and congestion; and much more. Autoware uses HD maps as a virtual sensor to accomplish its mission and motion planning tasks. This tutorial aims to explain, step by step, how to create pointcloud maps as well as Lanelet2 HD maps.
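As a taste of what pointcloud map generation involves: after individual LiDAR scans are registered into a common frame, the merged cloud is typically reduced to a manageable density before being used for localization. The sketch below illustrates one common reduction step, voxel-grid downsampling, using plain NumPy; it is a simplified illustration, not Autoware's actual map-generation pipeline, and the function name and parameters are chosen here for the example.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Reduce an (N, 3) point cloud by keeping one centroid per voxel.

    Each point is assigned to a cubic voxel of edge length `voxel_size`;
    all points falling in the same voxel are replaced by their mean.
    """
    # Integer voxel index for every point.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel: `inverse` maps each point to its voxel's row.
    _, inverse, counts = np.unique(
        idx, axis=0, return_inverse=True, return_counts=True
    )
    # Accumulate per-voxel sums, then divide by point counts -> centroids.
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

# Example: 10,000 synthetic points in a 10 m cube, reduced with 0.5 m voxels.
cloud = np.random.rand(10_000, 3) * 10.0
small = voxel_downsample(cloud, voxel_size=0.5)
# `small` has at most 20^3 = 8000 points, so strictly fewer than the input.
```

The same idea underlies the voxel-grid filters found in pointcloud libraries; real map pipelines apply it after scan registration so that localization can match live scans against a compact, evenly sampled map.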