I was a post-doc at the Institut für Neuroinformatik at the Ruhr-Universität Bochum. I finished my PhD with Prof. Wolfgang Maass at the Institute for Theoretical Computer Science at Technische Universität Graz, working on biologically plausible learning and meta-learning in spiking neural networks.
I have a Master's degree in computer science from the University of Texas at Austin, where I worked with Prof. Risto Miikkulainen on using neuroevolution and task decomposition to learn complex tasks. After my Master's, I worked as a Software Development Engineer on the DynamoDB team at Amazon.com for a couple of years.
My Erdős number is 3.
I am broadly interested in learning and intelligence, both algorithmic and biological. My current research interests include:
- scalable and energy-efficient machine learning
- understanding and developing algorithms for continual learning
- creating algorithms uniquely suited for recurrent neural networks
A lot of my work derives inspiration from neuroscience and biology in the quest to build a better and more general artificial intelligence.
If you’re interested in starting a collaboration, don’t hesitate to reach out.
I’m happy to support your application for externally funded post-doc fellowships to work with me.
For those interested in externally funded PhD programs, internships, or remote internships, feel free to email me to discuss the possibilities.
Stay tuned, as I’ll be regularly updating the site with other open positions as they become available.
I commit a few hours per month to mentoring members of underrepresented groups in academia. If you need guidance on career choices, research directions, PhD applications, or anything else, please book a slot on my calendar.
* denotes equal contributions
- Jain A, Subramoney A, Miikkulainen R. "Task decomposition with neuroevolution in extended predator-prey domain". In: Proceedings of Thirteenth International Conference on the Synthesis and Simulation of Living Systems. East Lansing, MI, USA; 2012. conference (url) (pdf) (bibtex)
- Petrovici MA, Schmitt S, Klähn J, Stöckel D, Schroeder A, Bellec G, Bill J, Breitwieser O, Bytschok I, Grübl A, Güttler M, Hartel A, Hartmann S, Husmann D, Husmann K, Jeltsch S, Karasenko V, Kleider M, Koke C, Kononov A, Mauch C, Müller E, Müller P, Partzsch J, Pfeil T, Schiefer S, Scholze S, Subramoney A, Thanasoulis V, Vogginger B, Legenstein R, Maass W, Schüffny R, Mayr C, Schemmel J, Meier K. "Pattern representation and recognition with accelerated analog neuromorphic systems". In: 2017 IEEE International Symposium on Circuits and Systems (ISCAS). 2017. p. 1–4. conference (url) (pdf) (bibtex)
- Kaiser J, Stal R, Subramoney A, Roennau A, Dillmann R. "Scaling up liquid state machines to predict over address events from dynamic vision sensors". Bioinspiration & Biomimetics. June 2017; journal (url) (pdf) (bibtex)
- Bellec* G, Salaj* D, Subramoney* A, Legenstein R, Maass W. "Long short-term memory and Learning-to-learn in networks of spiking neurons". In: Advances in Neural Information Processing Systems 31. Curran Associates, Inc.; 2018. p. 795–805. conference (url) (pdf) (bibtex)
- Kaiser* J, Hoff* M, Konle A, Vasquez Tieck JC, Kappel D, Reichard D, Subramoney A, Legenstein R, Roennau A, Maass W, Dillmann R. "Embodied Synaptic Plasticity With Online Reinforcement Learning". Frontiers in Neurorobotics. 2019;13:81. journal (url) (bibtex)
- Subramoney A, Scherr F, Maass W. "Learning to learn motor prediction by networks of spiking neurons". In: Workshop on Robust Artificial Intelligence For Neurorobotics, Edinburgh. 2019. workshop (url) (bibtex)
- Bellec* G, Scherr* F, Subramoney A, Hajek E, Salaj D, Legenstein R, Maass W. "A solution to the learning dilemma for recurrent networks of spiking neurons". Nature Communications. July 2020;11(1):3625. journal (url) (preprint) (bibtex)
- Subramoney A, Scherr F, Maass W. "Reservoirs Learn to Learn". In: Nakajima K, Fischer I, editors. Reservoir Computing: Theory, Physical Implementations, and Applications. Singapore: Springer; 2021. p. 59–76. (Natural Computing Series). bookchapter (url) (preprint) (bibtex)
- Subramoney A, Bellec G, Scherr F, Legenstein R, Maass W. "Revisiting the role of synaptic plasticity and network dynamics for fast learning in spiking neural networks". bioRxiv. January 2021; preprint (preprint) (bibtex)
- Rao* A, Legenstein* R, Subramoney A, Maass W. "A normative framework for learning top-down predictions through synaptic plasticity in apical dendrites". bioRxiv. March 2021; preprint (preprint) (bibtex)
- Salaj* D, Subramoney* A, Kraišniković* C, Bellec G, Legenstein R, Maass W. "Spike Frequency Adaptation Supports Network Computations on Temporally Dispersed Information". eLife. July 2021;10:e65459. journal (url) (preprint) (bibtex)
- Yegenoglu A, Subramoney A, Hater T, Jimenez-Romero C, Klijn W, Pérez Martín A, van der Vlag M, Herty M, Morrison A, Diaz Pier S. "Exploring parameter and hyper-parameter spaces of neuroscience models on high performance computers with Learning to Learn". Frontiers in Computational Neuroscience. May 2022;:46. journal (url) (preprint) (bibtex)
- Subramoney A, Nazeer KK, Schöne M, Mayr C, Kappel D. "Efficient Recurrent Architectures through Activity Sparsity and Sparse Back-Propagation through Time". In: International Conference on Learning Representations. 2023. conference Spotlight (notable-top-25%) presentation (url) (preprint) (talk) (bibtex)
- Subramoney A. "Efficient Real Time Recurrent Learning through Combined Activity and Parameter Sparsity". In: ICLR 2023 Workshop: Sparsity in Neural Networks (SNN). arXiv; 2023. workshop (url) (preprint) (bibtex)
- Kappel D, Nazeer KK, Fokam CT, Mayr C, Subramoney A. "Block-local learning with probabilistic latent representations". arXiv; 2023. preprint (preprint) (bibtex)
- Grappolini EW, Subramoney A. "Beyond Weights: Deep learning in Spiking Neural Networks with pure synaptic-delay training". In: Proceedings of the 2023 International Conference on Neuromorphic Systems. New York, NY, USA: Association for Computing Machinery; 2023. (ICONS ’23). conference (url) (preprint) (bibtex)
- Mukherji R, Schöne M, Nazeer KK, Mayr C, Subramoney A. "Activity Sparsity Complements Weight Sparsity for Efficient RNN Inference". In: NeurIPS 2023 Workshop: ML with New Compute Paradigms (MLNCP). 2023. workshop (preprint) (bibtex)
- Subramoney A. "Evaluating Modular Neuroevolution in Robotic Keepaway Soccer" [Master's thesis]. [Austin, TX]: Department of Computer Science, The University of Texas at Austin; 2012. 54 p. thesis (url) (pdf) (bibtex)
- Subramoney A. "Biologically plausible learning and meta-learning in recurrent networks of spiking neurons" [PhD thesis]. [Graz, Austria]: Institute for Theoretical Computer Science, Graz University of Technology; 2020. thesis (url) (pdf) (bibtex)
- Beyond Biologically Plausible Spiking Networks for Neuromorphic Computing (SNUFA Workshop 2022)
- General Purpose event-based architectures for deep learning (SNUFA Seminar series 2022)
- New learning methods for recurrent networks of spiking neurons (Telluride 2020)
- Long short-term memory and learning-to-learn in networks of spiking neurons (NeurIPS 2018)
Open Source Software
- Event-based Neural Networks
- A high-performance library for recording and storing simulation data
- Learning to Learn: Gradient-free Optimization framework
- Liquid State Machines in Python and NEST
- Live plotting using matplotlib
- Topics in Deep Learning for Sequence Processing – Master's Seminar: WS2021/22, WS2022/23
- Introduction to Python: SS2021
- Machine Learning – Design Practical: SS2019
- Computational Intelligence – Lecture: SS2019, SS2018, SS2017, SS2016
- Computational Intelligence – Practical: SS2019, SS2018, SS2017, SS2016, SS2015
- Computational Intelligence SEW/CS – Lecture: SS2019, SS2018, SS2017, SS2016
- Computational Intelligence SEW/CS – Practical: SS2019, SS2018, SS2017, SS2016, SS2015
- Autonomously Learning Systems – Design Practical: WS2016/17
- Machine Learning (Probabilistic Methods in Machine Learning) – Design Practical: WS2015/16
- Introduction to Computing – Teaching Assistant: Fall 2012