
Welcome to the Institute of Theoretical Computer Science at TU Graz

The Institute of Theoretical Computer Science was founded in 1992 to investigate fundamental problems in information processing such as the design of computer algorithms, the complexity of computations and computational models, automated knowledge acquisition (machine learning), the complexity of learning algorithms, pattern recognition with artificial neural networks, computational geometry, and information processing in biological neural systems.

Its research integrates methods from mathematics, computer science and computational neuroscience.

In education, the institute is responsible for courses and seminars that introduce students to the basic techniques and results of theoretical computer science. In addition, it offers advanced courses, seminars, and applied computer projects in computational geometry, computational complexity theory, machine learning, and neural networks.


Paper accepted at ICML

June 2021: Want to train a sparsely connected neural network that is robust to adversarial attacks? Use our novel method, developed in the Dependable Systems Lab of Silicon Austria Labs.

Ozan Özdenizci and Robert Legenstein. Training adversarially robust sparse networks via Bayesian connectivity sampling. In Proceedings of the 38th International Conference on Machine Learning, Marina Meila and Tong Zhang, editors, volume 139 of Proceedings of Machine Learning Research, pages 8314-8324. PMLR, 18-24 Jul 2021. (Link to PDF)

Congratulations, Ozan!

New research article featured in the news

June 2021: How to control a robotic elephant trunk with a spiking neural network.

Manuel Traub, Robert Legenstein, and Sebastian Otte. Many-joint robot arm control with recurrent spiking neural networks. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021. Accepted for publication. (Link to ArXiv PDF)


Article featured in Nature Communications Editor's Highlights

Our article on biologically plausible backprop through time is featured in the Nature Communications Editor's Highlights.


FET-Open project ADOPD started in October

Oct 2020: The EU FET-Open project ADOPD (Adaptive Optical Dendrites) has started. See Projects.


Paper accepted for NeurIPS 2020

Thomas Limbacher and Robert Legenstein. H-Mem: Harnessing synaptic plasticity with Hebbian memory networks. Accepted for NeurIPS 2020. bioRxiv, 2020. (Link to PDF)

New article in Nature Communications

Aug 2020: The IGI team has published a new article in Nature Communications: A solution to the learning dilemma for recurrent networks of spiking neurons


New research from the institute

June 2020: How do neurons in the brain interact to create the mind? IGI researchers tackled this question in a series of papers in PNAS, Cerebral Cortex, and eNeuro. Read more...

News & Views Article on our research

May 2020: Nature Machine Intelligence published this News & Views article on our research: An alternative to backpropagation through time.

SMALL Project has started

Mar 2020: The EU FET-Proactive project SMALL (Spiking Memristive Architectures for Learning to Learn) has started. See Projects and the Webpage.

Wolfgang Maass on Braininspired.co - Podcast

Feb 2020: Don't miss these interviews with Wolfgang Maass on computational neuroscience, brain-inspired computation, and more at Braininspired.co.

Cooperation between TU Graz and Silicon Austria Labs has started

Jan. 2020: On January 10, a cooperation between TU Graz and Silicon Austria Labs was launched with a press conference and the official signing of a contract that establishes two joint research labs. The goal of the cooperation is to foster basic research for future microelectronics. The Institute of Theoretical Computer Science is part of the Dependable Embedded Systems Lab, where research will focus on more reliable AI systems. Press release

New paper

Jan. 2019: Our new paper discusses how powerful learning algorithms could be implemented in biological neuronal networks. Bellec, Scherr, et al., Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets. arXiv, 2019.

Cooperation with Intel

Dec. 2018: The IGI cooperates with Intel on the design of spiking neural networks for their neuromorphic chip Loihi. Article at top500...

ELLIS Society launched

Dec. 2018: Prof. Robert Legenstein and Prof. Wolfgang Maass attended the founding ceremony of the ELLIS Society. ELLIS is an initiative of European scientists to establish a European Lab for Learning & Intelligent Systems (ELLIS open letter). more...