* – equal contributions
A contrastive rule for meta-learning
N Zucchet*, S Schug*, J von Oswald*, D Zhao, J Sacramento
NeurIPS 2022

Random initialisations performing above chance and how to find them
F Benzing, S Schug, R Meier, J von Oswald, Y Akram, N Zucchet, L Aitchison, A Steger
OPT2020 workshop at NeurIPS 2020
Presynaptic stochasticity improves energy efficiency and helps alleviate the stability-plasticity dilemma
S Schug*, F Benzing*, A Steger
eLife 10: e69884

Learning where to learn: Gradient sparsity in meta and continual learning
J von Oswald*, D Zhao*, S Kobayashi, S Schug, M Caccia, N Zucchet, J Sacramento
NeurIPS 2021
Task-agnostic continual learning via stochastic synapses
S Schug, F Benzing, A Steger
Workshop on Continual Learning at ICML 2020

Evolving instinctive behaviour in resource-constrained autonomous agents using grammatical evolution
A Hallawa, S Schug, G Iacca, G Ascheid
EvoStar 2020