Julia Implementation of Transformer models
Updated Aug 2, 2025 · Julia
Memory, Attention and Composition (MAC) Network for CLEVR implemented via KnetLayers
Implementation of Single Headed Attention Recurrent Neural Networks (SHA-RNN) in Julia and Knet
Model implementation for "Adaptive computation as a new mechanism of dynamic human attention"
A Julia package providing modular and extensible attention mechanisms for deep learning models.
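The repositories above all build on the same core operation, scaled dot-product attention. A minimal sketch in plain Julia (names like `attention` and `softmax_cols` are illustrative, not taken from any of the listed packages):

```julia
using LinearAlgebra

# Column-wise softmax with max-subtraction for numerical stability.
function softmax_cols(x::AbstractMatrix)
    e = exp.(x .- maximum(x; dims=1))
    e ./ sum(e; dims=1)
end

# Q, K, V are (d, n) matrices: d features per token, n tokens.
function attention(Q, K, V)
    d = size(Q, 1)
    scores = (K' * Q) ./ sqrt(d)   # (n, n): score of key i for query j
    A = softmax_cols(scores)       # each column sums to 1
    V * A                          # (d, n): weighted sum of values per query
end

Q = randn(8, 4); K = randn(8, 4); V = randn(8, 4)
out = attention(Q, K, V)           # size(out) == (8, 4)
```

Multi-head variants, as in the Transformer implementations listed here, split the feature dimension into several such attention computations run in parallel and concatenate the results.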