Tensors and Dynamic neural networks in Python with strong GPU acceleration
A Tensor module that allows a deep learning framework to switch seamlessly between different engines.
Understanding neural network libraries and automatic gradient computation (autograd) in the backward pass
A neural network library built around an Automatic Differentiation system written from scratch. The main focus of this project is its 'AutoGrad' system.
Learn the fundamentals of PyTorch in this hands-on lab from MIT’s Deep Learning course. You'll explore tensors, build neural networks, and implement gradient descent — all key building blocks for deep learning.
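The gradient-descent step that labs like this build up to can be sketched without any framework at all. The objective f(w) = (w - 4)^2 and the learning rate below are illustrative choices, not taken from the MIT course:

```python
# Minimal gradient descent on f(w) = (w - 4)^2, whose minimum is at w = 4.
# Purely illustrative; the lab itself works with PyTorch tensors.
def grad(w):
    return 2.0 * (w - 4.0)  # analytic derivative of (w - 4)^2

w, lr = 0.0, 0.1            # starting point and step size (arbitrary)
for _ in range(100):
    w -= lr * grad(w)       # step against the gradient

print(round(w, 4))          # converges toward 4.0
```

Each update shrinks the distance to the minimum by a constant factor (here 0.8), so a hundred steps is far more than enough to converge to machine-visible precision.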
A simple Autograd engine written in Crystal
A re-implementation of micrograd by Andrej Karpathy for learning purposes: a PyTorch-like API for autograd and a neural net library.
RUNE: RUsty Neural Engine
Solving Schrödinger's Equation with a Neural Network using numerical integration and autograd. See https://arxiv.org/abs/2104.04795
Learn PyTorch basics with tensors, GPU checks, and tensor creation. Build a strong foundation for deep learning.
A minimal Java neural network library with autograd, MLP, and classification examples – built from scratch.
demo repository containing the experiments for my master's seminar @ TUM
Space-Time Attention with a Shifted Non-Local Search
Micrograd is an autograd engine built from scratch
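The core idea behind these from-scratch engines fits in a few dozen lines of Python: each operation records its inputs and a local backward rule, and a topological pass applies the chain rule in reverse. The `Value` class below is an illustrative micrograd-style node, not code from any of the repositories listed here:

```python
class Value:
    """Minimal scalar autograd node, in the spirit of micrograd (illustrative sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # local chain-rule step, set by each op
        self._prev = set(_children)     # inputs that produced this node

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Build a topological order of the graph, then run the chain rule
        # from the output back to the leaves.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

x = Value(3.0)
y = x * x + x   # f(x) = x^2 + x
y.backward()
print(x.grad)   # df/dx = 2x + 1 = 7.0
```

Gradients accumulate with `+=` rather than `=` so that a node used more than once (like `x` above, which feeds both factors of `x * x`) collects contributions from every path through the graph.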
A pedagogical implementation of Automatic Differentiation on multi-dimensional tensors.
A simple, tiny autograd engine
A small (quant) autograd library with visualization of the individual operations within neurons