* denotes equal contribution

2024

Attention as a Hypernetwork
S Schug, S Kobayashi, Y Akram, J Sacramento, R Pascanu
Preprint
Preprint Code

When can transformers compositionally generalize in-context?
S Kobayashi*, S Schug*, Y Akram*, F Redhardt, J von Oswald, R Pascanu, G Lajoie, J Sacramento
NGSM workshop at ICML 2024
Preprint Workshop

Discovering modular solutions that generalize compositionally
S Schug*, S Kobayashi*, Y Akram, M Wołczyk, A Proca, J von Oswald, R Pascanu, J Sacramento, A Steger
ICLR 2024
Paper Preprint Code

2023

Would I have gotten that reward? Long-term credit assignment by counterfactual contribution analysis
A Meulemans*, S Schug*, S Kobayashi*, N Daw, G Wayne
NeurIPS 2023 (spotlight)
Paper Preprint Code Tweet

Online learning of long-range dependencies
N Zucchet*, R Meier*, S Schug*, A Mujika, J Sacramento
NeurIPS 2023
Paper Preprint Code

A complementary systems theory of meta-learning
S Schug, N Zucchet, J von Oswald, J Sacramento
Cosyne 2023
Poster

2022

A contrastive rule for meta-learning
N Zucchet*, S Schug*, J von Oswald*, D Zhao, J Sacramento
NeurIPS 2022
Paper Preprint Code Tweet

Random initialisations performing above chance and how to find them
F Benzing, S Schug, R Meier, J von Oswald, Y Akram, N Zucchet, L Aitchison, A Steger
OPT 2022 workshop at NeurIPS 2022
Paper Preprint Code Tweet

2021

Presynaptic stochasticity improves energy efficiency and helps alleviate the stability-plasticity dilemma
S Schug*, F Benzing*, A Steger
eLife 10: e69884
Paper Preprint Code Tweet

Learning where to learn: Gradient sparsity in meta and continual learning
J von Oswald*, D Zhao*, S Kobayashi, S Schug, M Caccia, N Zucchet, J Sacramento
NeurIPS 2021
Paper Preprint Code

2020

Task-Agnostic Continual Learning via Stochastic Synapses
S Schug, F Benzing, A Steger
Workshop on Continual Learning at ICML 2020
Paper Workshop

Evolving instinctive behaviour in resource-constrained autonomous agents using grammatical evolution
A Hallawa, S Schug, G Iacca, G Ascheid
EvoStar 2020
Paper