Visual Inertial SLAM

About

This project implemented a Bayes filter to solve the Simultaneous Localization and Mapping (SLAM) problem for a robot moving in an initially unknown environment. Specifically, we implemented a visual-inertial Extended Kalman Filter that uses a Gaussian distribution to estimate the robot pose and landmark positions at any given time. The map consists of the collection of these landmark feature points.
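At each time step the filter therefore maintains a single Gaussian over the IMU pose and all landmark positions. Below is a minimal sketch of that state layout, assuming the standard visual-inertial EKF formulation; the shapes and names are illustrative, not necessarily this repository's exact layout.

```python
# Illustrative Gaussian SLAM state; shapes follow the standard VI-EKF
# formulation and are assumptions, not this repository's exact layout.
import numpy as np

M = 100                        # number of landmark features (hypothetical)
T = np.eye(4)                  # IMU pose mean, an element of SE(3)
landmarks = np.zeros((3, M))   # landmark position means in the world frame
Sigma = np.eye(6 + 3 * M)      # joint covariance: 6 pose DOFs + 3 per landmark
```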

Environment Setup

  • To replicate the project, first build a conda environment from the provided ece276a.yaml file:
 $conda env create -f ece276a.yaml
  • Then activate the conda environment:
 $conda activate ece276a

File Details

Files are structured in the code folder as shown below.

code
├── data
│   ├── 03.npz
│   └── 10.npz
├── ece276a.yaml
├── pr3_utils.py
├── README.md
├── requirements.txt
├── visual_slam.py
└── vslam.py

a. vslam.py

This script runs only the IMU EKF predict step and the landmark update step, corresponding to Problems (a) and (b) in the Project Guidelines PDF. Usage:

 $python3 code/vslam.py 
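For context, a minimal sketch of the SE(3) EKF predict step this script performs is shown below. It assumes the standard twist-based kinematics; the helper names (hat, twist_hat, twist_adj, ekf_predict) and the noise covariance W are illustrative, not the repository's actual API.

```python
# Sketch of the IMU EKF predict step on SE(3) -- an illustration of the
# standard formulation, not this repository's exact implementation.
import numpy as np
from scipy.linalg import expm

def hat(w):
    """3x3 skew-symmetric matrix of a 3-vector w."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_hat(u):
    """4x4 se(3) matrix of a twist u = [v, omega]."""
    M = np.zeros((4, 4))
    M[:3, :3] = hat(u[3:])
    M[:3, 3] = u[:3]
    return M

def twist_adj(u):
    """6x6 adjoint ("curly hat") of a twist u = [v, omega]."""
    A = np.zeros((6, 6))
    A[:3, :3] = hat(u[3:])
    A[:3, 3:] = hat(u[:3])
    A[3:, 3:] = hat(u[3:])
    return A

def ekf_predict(T, Sigma, u, tau, W):
    """Propagate the pose mean T (4x4) and covariance Sigma (6x6)
    over a time step tau using the measured twist u (6,)."""
    T_next = T @ expm(tau * twist_hat(u))   # mean: exponential-map kinematics
    F = expm(-tau * twist_adj(u))           # linearized motion model
    return T_next, F @ Sigma @ F.T + W      # add motion noise W (6x6)
```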

b. visual_slam.py

This script runs the full visual-inertial SLAM algorithm on the given datasets, performing the update step for both the IMU pose and the landmarks jointly. Usage:

 $python3 code/visual_slam.py 
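As a rough illustration of the joint update this script performs, the generic EKF correction over the stacked state (IMU pose plus landmarks) looks as follows. The measurement Jacobian H, predicted features z_hat, and pixel-noise covariance V would come from the stereo camera model; they are assumed inputs here, and this flat-vector sketch glosses over the fact that the pose part of the correction is applied on SE(3) via the exponential map rather than by vector addition.

```python
# Generic EKF update over the stacked VI-SLAM state -- illustrative only;
# H, V, and z_hat come from the stereo camera model and are assumed inputs.
import numpy as np

def ekf_update(mu, Sigma, z, z_hat, H, V):
    """Correct the prior mean mu and covariance Sigma with the observed
    pixel features z, given predictions z_hat and Jacobian H."""
    S = H @ Sigma @ H.T + V                 # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)      # Kalman gain
    mu_new = mu + K @ (z - z_hat)           # correct the mean (the pose
                                            # block is really updated on SE(3))
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_new, Sigma_new
```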

Technical Report

Results

Independent IMU predict and Landmark Updates

Data 03.npz

Data 10.npz

Visual SLAM

Data 03.npz

Data 10.npz
