Spikes, Stats, and Source Code

Hacking with the Linderman Lab

We like to write code, develop models, explore datasets, and learn new things. Sometimes we go the extra mile and publish our Colab notebooks here. Follow along with us, or better yet, make a copy and start hacking too!

Parallelizing Nonlinear RNNs with the Ungulates: DEER and ELK

By Xavier Gonzalez, December 2, 2024

Parallel computation has enabled the deep learning revolution. In sequence modeling, transformers and deep SSMs (linear RNNs) have become dominant, in part because they can be parallelized over the sequence length, allowing them to put modern parallel hardware (GPUs/TPUs) to work on long sequences. Evaluating nonlinear RNNs, by contrast, was long thought to be inherently sequential. But nonlinear RNNs can be parallelized too! We show you how in this blog post.
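
For a taste of the trick: DEER treats the whole state trajectory as the root of one big fixed-point equation and runs Newton's method on it. Each Newton step only has to solve a linear recurrence, and linear recurrences parallelize over time with an associative scan. Here is a minimal JAX sketch of that loop; the tanh cell, sizes, and iteration count are illustrative assumptions, not the post's code.

```python
import jax
import jax.numpy as jnp

def cell(s, x, A, B):
    # Toy nonlinear RNN cell (an assumption): s_t = tanh(A s_{t-1} + B x_t).
    return jnp.tanh(A @ s + B @ x)

def combine(elem_i, elem_j):
    # Compose affine maps s -> J s + b; associative, batched over time.
    J_i, b_i = elem_i
    J_j, b_j = elem_j
    return jnp.matmul(J_j, J_i), jnp.einsum("...ij,...j->...i", J_j, b_i) + b_j

def deer(s0, xs, A, B, num_iters=10):
    # Newton's method on the fixed-point equations s_t = cell(s_{t-1}, x_t).
    s = jnp.zeros((xs.shape[0], s0.shape[0]))
    for _ in range(num_iters):
        prev = jnp.concatenate([s0[None], s[:-1]])   # guesses for s_{t-1}
        Js = jax.vmap(jax.jacobian(cell), (0, 0, None, None))(prev, xs, A, B)
        fs = jax.vmap(cell, (0, 0, None, None))(prev, xs, A, B)
        bs = fs - jnp.einsum("tij,tj->ti", Js, prev)
        # Solve the linearized recurrence s_t = J_t s_{t-1} + b_t for all t
        # at once with an associative scan (O(log T) parallel depth).
        J_cum, b_cum = jax.lax.associative_scan(combine, (Js, bs))
        s = jnp.einsum("tij,j->ti", J_cum, s0) + b_cum
    return s

def sequential(s0, xs, A, B):
    # Reference sequential rollout, for checking the parallel answer.
    return jax.lax.scan(lambda s, x: (cell(s, x, A, B),) * 2, s0, xs)[1]

k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
A = 0.25 * jax.random.normal(k1, (4, 4))   # mildly contractive dynamics
B = jax.random.normal(k2, (4, 3))
xs = jax.random.normal(k3, (512, 3))
s0 = jnp.zeros(4)
print(jnp.max(jnp.abs(deer(s0, xs, A, B) - sequential(s0, xs, A, B))))
```

Each Newton iteration costs O(log T) depth on parallel hardware, versus the O(T) depth of the sequential rollout it replaces.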


Weighing the Evidence in Sharp Wave Ripples

By Scott Linderman, January 24, 2022

Krause and Drugowitsch (2022) presented a novel approach to decoding and classifying sharp-wave ripples using state space models and Bayesian model comparison. We implemented a simple version of their method using our SSM package and applied it to some synthetic data. Spoiler alert: their method works really well!
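
The heart of the method is scoring each hypothesis about the ripple content (stationary, moving, and so on) by the marginal likelihood of the decoded trajectory under a corresponding state space model, then comparing hypotheses with Bayes factors. The sketch below is not the notebook's SSM-package code; it is a stripped-down 1-D linear-Gaussian version where the Kalman filter gives the log marginal likelihood in closed form, with made-up parameter values.

```python
import numpy as np

def log_marginal_likelihood(ys, q, r, m0=0.0, v0=1.0):
    # Kalman filter for x_t = x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
    # Accumulates log p(y_{1:T}) one innovation at a time.
    ll, m, v = 0.0, m0, v0
    for y in ys:
        m_pred, v_pred = m, v + q                    # predict
        s = v_pred + r                               # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * s) + (y - m_pred) ** 2 / s)
        k = v_pred / s                               # Kalman gain
        m, v = m_pred + k * (y - m_pred), (1 - k) * v_pred  # update
    return ll

# Synthetic "decoded position" during a putative ripple event.
rng = np.random.default_rng(0)
ys = np.cumsum(rng.normal(0.0, 0.5, size=50)) + rng.normal(0.0, 0.3, size=50)

# Hypothesis 1: the position is stationary (negligible dynamics noise).
# Hypothesis 2: it traces a moving trajectory (a random walk).
ll_stationary = log_marginal_likelihood(ys, q=1e-6, r=0.3 ** 2)
ll_moving = log_marginal_likelihood(ys, q=0.5 ** 2, r=0.3 ** 2)
print("log Bayes factor (moving vs. stationary):", ll_moving - ll_stationary)
```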


Bayesian inference with normal inverse Wishart priors and natural parameters

By Scott Linderman, November 18, 2021

Building on the previous post, we show how to do Bayesian inference in a multivariate normal model using the normal-inverse-Wishart prior and its natural exponential family form.
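
The payoff of the natural-parameter view is that conjugate updating becomes pure bookkeeping: write the prior as pseudo-statistics, add the data's sufficient statistics, and read off the posterior. A minimal NumPy sketch, using one common NIW convention (not necessarily the post's exact parameterization):

```python
import numpy as np

def niw_to_stats(mu, kappa, nu, Psi):
    # NIW(mu, kappa, nu, Psi) as pseudo-counts and pseudo-statistics.
    return kappa, kappa * mu, nu, Psi + kappa * np.outer(mu, mu)

def stats_to_niw(c1, s1, c2, s2):
    # Invert the map above to recover the standard hyperparameters.
    kappa, nu = c1, c2
    mu = s1 / kappa
    return mu, kappa, nu, s2 - kappa * np.outer(mu, mu)

def niw_posterior(mu0, kappa0, nu0, Psi0, X):
    # Posterior = prior pseudo-statistics + data sufficient statistics.
    n = X.shape[0]
    prior = niw_to_stats(mu0, kappa0, nu0, Psi0)
    data = (n, X.sum(axis=0), n, X.T @ X)
    return stats_to_niw(*(p + d for p, d in zip(prior, data)))

# Example: 100 draws from N([1, -2], 0.25 I); the posterior mean hugs x-bar.
rng = np.random.default_rng(0)
X = rng.normal([1.0, -2.0], 0.5, size=(100, 2))
mu_n, kappa_n, nu_n, Psi_n = niw_posterior(np.zeros(2), 1.0, 4.0, np.eye(2), X)
print(mu_n, kappa_n, nu_n)
```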


Implementing a Normal Inverse Wishart Distribution with TensorFlow Probability

By Scott Linderman, October 27, 2021

The normal inverse Wishart (NIW) distribution is a basic building block for Bayesian models. Unfortunately, it’s not one of TensorFlow Probability’s basic distributions, and making one with TFP building blocks was a bit tricky. This notebook shows my solution.
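
For a flavor of the construction, one route is to push a Wishart's Cholesky factor through the CholeskyToInvCholesky bijector, yielding chol(Sigma) for an inverse Wishart, and then draw mu | Sigma ~ N(mu0, Sigma / kappa0) inside a joint distribution. The sketch below follows that route with made-up hyperparameters; it is one possible assembly, not necessarily the notebook's solution.

```python
import numpy as np
import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

# Made-up hyperparameters for a 2-D NIW(mu0, kappa0, nu0, Psi0).
d, kappa0, nu0 = 2, 1.0, 5.0
mu0 = np.zeros(d, dtype=np.float32)
Psi0 = np.eye(d, dtype=np.float32)

niw = tfd.JointDistributionNamed(dict(
    # If W ~ Wishart(nu0, Psi0^{-1}) then Sigma = W^{-1} ~ IW(nu0, Psi0).
    # input_output_cholesky=True makes the Wishart emit chol(W), and the
    # bijector maps chol(W) -> chol(W^{-1}) = chol(Sigma).
    chol_Sigma=tfd.TransformedDistribution(
        tfd.WishartTriL(
            df=nu0,
            scale_tril=np.linalg.cholesky(np.linalg.inv(Psi0)),
            input_output_cholesky=True),
        bijector=tfb.CholeskyToInvCholesky()),
    # mu | Sigma ~ N(mu0, Sigma / kappa0).
    mu=lambda chol_Sigma: tfd.MultivariateNormalTriL(
        loc=mu0, scale_tril=chol_Sigma / np.float32(np.sqrt(kappa0))),
))

sample = niw.sample(seed=0)
print(sample["mu"].shape, sample["chol_Sigma"].shape)  # (2,), (2, 2)
```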


Structured Variational Autoencoders with TensorFlow Probability and JAX!

By Yixiu Zhao, November 13, 2020

Structured VAEs (Johnson et al., 2016) combine probabilistic graphical models (PGMs) and neural networks in one deep generative model, but they’re pretty tricky to implement. We show how to build them using TensorFlow Probability and JAX!