
Simultaneous Localization and Mapping (SLAM)

In the field of robotics, Simultaneous Localization and Mapping (SLAM) is the problem of constructing and continuously updating a map of an environment that is unknown to the robot, while at the same time keeping track of the robot's location within that map.

In this project, I developed several algorithms to implement SLAM for a mobile robot in an indoor environment, using odometry, inertial, and range measurements previously collected with the robot. These measurements came from wheel encoders, an Inertial Measurement Unit (IMU), and a Hokuyo UTM-30LX Light Detection and Ranging (LIDAR) sensor.

The project has two main parts. In the first part, I estimated the map and the robot's position using dead reckoning. In the second part, I implemented a Particle Filter (PF) with systematic resampling to perform full SLAM and improve on the dead-reckoning results.
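To illustrate the two building blocks named above, here is a minimal sketch in Python of a differential-drive dead-reckoning step and the systematic (low-variance) resampling scheme. The function names, the `wheel_base` parameter, and the wheel-travel inputs are my own illustrative assumptions, not taken from the project code.

```python
import numpy as np

def dead_reckoning_step(pose, d_left, d_right, wheel_base):
    """Integrate one differential-drive odometry step (dead reckoning).
    pose = (x, y, theta); d_left/d_right are wheel travel distances."""
    x, y, theta = pose
    d_center = (d_left + d_right) / 2.0          # distance moved by robot center
    d_theta = (d_right - d_left) / wheel_base    # change in heading
    # Use the midpoint heading for a slightly better arc approximation
    x += d_center * np.cos(theta + d_theta / 2.0)
    y += d_center * np.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)

def systematic_resample(weights):
    """Systematic resampling: draw one random offset, then sweep N evenly
    spaced pointers over the cumulative weights. Returns particle indices."""
    n = len(weights)
    positions = (np.random.uniform() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)
```

Systematic resampling is popular in particle filters because it uses a single random number per resampling step, which keeps the variance of the surviving particle set low compared with independent multinomial draws.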

I implemented the project in Python and avoided using any SLAM libraries, so I can confidently say that I now understand how the algorithms work through and through!

Check out a few videos of the different environments the robot visited, and click the button below to visit the project repository and read the report!

Project Repository
