Development of a data logging, processing and visualization platform for autonomous driving

Real-life data is needed to develop and validate algorithms for the many applications that are becoming more autonomous, ranging from logistics and robotics to agricultural and commercial vehicles. A small-footprint data capturing system that can easily be added to these applications gives developers a head start by providing easy access to real-life data. Ideally, this platform can then also be used for preliminary tests with the developed algorithms. By including a visualization of the (processed) data, engineers will be able to assess whether the data capturing and processing is performing correctly.

GOAL OF THE INTERNSHIP

The goal of this internship is to develop a data logging, processing and visualization platform for autonomous driving with a small footprint, using the RTMaps software. RTMaps is an efficient and easy-to-use framework for fast and robust development in a multi-sensor environment, comparable to Simulink. The block-based programming used in RTMaps can significantly speed up the development of an application.

Both a Lidar and a camera will be connected to the platform. The model you build in RTMaps to capture this camera and Lidar data will have to be deployed on an Nvidia Jetson, and you will investigate the different steps towards that deployment. Afterwards, this Nvidia Jetson will be integrated into a stand-alone hardware box: you will select the necessary components (battery, connectors, storage, box, ...) and assemble it. The box will be mounted on a mobile platform for a data capturing campaign.

In a final step, you will develop and test, in RTMaps, the online processing and visualization of the Lidar and/or camera data (i.e. perform basic obstacle detection) and benchmark it against an available implementation. The processed data will be output over ROS2, creating a stand-alone processing unit for one or two sensors that can communicate with other applications over ROS2. To guarantee a small footprint, only a limited set of sensors (Camera/Lidar) has to be supported by the platform, and the number of concurrently processed sensors will be limited (<=2).
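To give a feel for the kind of "basic obstacle detection" this task refers to, below is a minimal sketch in Python (the posting's own software stack) that flags Lidar points rising above an assumed flat ground plane within a range of interest. The function name, the point format, and all thresholds are illustrative assumptions, not part of the assignment or of any RTMaps/ROS2 API.

```python
# Minimal sketch of threshold-based obstacle detection on a Lidar
# point cloud. Points are (x, y, z) tuples in metres, sensor frame.
# Function name, point format and thresholds are illustrative assumptions.

def detect_obstacles(points, ground_z=-1.5, min_height=0.2, max_range=30.0):
    """Return the points sticking out above the ground plane within range."""
    obstacles = []
    for x, y, z in points:
        height = z - ground_z              # height above assumed flat ground
        dist = (x * x + y * y) ** 0.5      # horizontal distance to the sensor
        if height > min_height and dist <= max_range:
            obstacles.append((x, y, z))
    return obstacles

# Example: two near-ground returns and one raised return
cloud = [(5.0, 0.0, -1.5), (10.0, 2.0, -1.45), (8.0, -1.0, -0.5)]
print(detect_obstacles(cloud))  # -> [(8.0, -1.0, -0.5)]
```

In a real deployment, this logic would sit in an RTMaps processing block fed by the Lidar capture component, with the detected points published over ROS2; a flat-ground assumption like the one above is only a starting point for benchmarking against a mature implementation.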

This internship includes both hardware and software development. Part of the tasks can be omitted, depending on the length of the assignment.

Learning target: You will gain hands-on experience with hardware and software commonly used in autonomous driving:

  • Sensors (Camera, Lidar)
  • Nvidia Jetson
  • Software: RTMaps, ROS2, Python, Linux

Field: Computer Science Engineering

Region: Flemish Brabant

Location: Leuven or Lommel


PRACTICAL DATA

This assignment is an internship but can also be executed by a thesis student from a Belgian university. The assignment runs for a minimum of 3 months and a maximum of 6 months and takes place at the Flanders Make offices in Lommel or Leuven, Belgium.

For internships: all software and hardware needed for the execution of the project will be provided by Flanders Make.

Desired Profile
  • Bachelor's degree in computer science, electrical, control or mechatronic engineering;
  • Experience with Matlab/Simulink and/or Python is required;
  • Knowledge of electronics hardware (Camera/Lidar/NAS/Nvidia GPU) and experience with electronic tinkering is recommended;
  • Knowledge of Linux, ROS and C++ is a plus;
  • Passionate about research and new technologies, with a focus on applications for autonomous driving and robotics;
  • Result-oriented, responsible and proactive;
  • A good communicator, able to communicate in English;
  • Eager to learn and a team player.

Only EEA or Swiss nationals can be accepted for internships due to work permit regulations.