Paper Overviews
- Modelling the neural code in large populations of correlated neurons

In this post I’m going to show how to use the neural-mixtures command-line interface (CLI) to generate the key results and corresponding figures from my paper Modelling the neural code in large populations of correlated neurons (Sokoloski, Aschner, and Coen-Cagli 2021). Along the way I will also explain how interested users can format new datasets for use with the neural-mixtures CLI.

Continue reading...
Goal Tutorials
- Introduction to gradient pursuit on manifolds
In this post we’ll go through the gradient-descent script, and see how to find the maximum of a function with gradient descent in Goal. This post won’t demonstrate why Goal is worth using, but it will serve to introduce many of the basic types and functions which constitute the Goal libraries. Moreover, we’ll see that relatively straightforward things remain relatively straightforward in Goal.
Continue reading...

- Fitting a mixture of von Mises distributions
In this post we’ll go through the von-mises-mixture script, and see how to fit a mixture of von Mises distributions with Goal. I have developed the Goal libraries primarily to do type-safe statistical modelling, and this post will touch on many of the features of Goal that make it worth using.
Continue reading...

- Lazy backpropagation
In this post we’ll go through the neural-network script, and see how to fit a simple neural network in Goal. I will then show how I’ve implemented an implicit version of backprop by combining Goal types with lazy evaluation, thereby avoiding explicitly storing and passing gradients. Explaining this implementation of backprop is my primary interest in this tutorial, as it is general, efficient, and (I think) kind of cool.
Continue reading...
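For orientation before the gradient-pursuit tutorial: the core idea of finding a maximum by repeatedly stepping along the gradient can be sketched in plain Haskell, without Goal's manifold machinery. Every name below (`step`, `ascend`, the example function) is illustrative only and is not part of the Goal API.

```haskell
-- One ascent step: move along the derivative df with learning rate eta.
step :: Double -> (Double -> Double) -> Double -> Double
step eta df x = x + eta * df x

-- Iterate n ascent steps from the starting point x0.
ascend :: Int -> Double -> (Double -> Double) -> Double -> Double
ascend n eta df x0 = iterate (step eta df) x0 !! n

-- Maximize f x = -(x - 3)^2, whose derivative is -2 * (x - 3);
-- the ascent should converge to the maximum at x = 3.
main :: IO ()
main = print (ascend 1000 0.05 (\x -> -2 * (x - 3)) 0)
```

The tutorial itself works on manifolds with Goal's own types; this sketch only shows the bare update rule those types wrap.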
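Similarly, the model at the heart of the von Mises tutorial can be written down directly. Below is a plain-Haskell sketch of the von Mises density and a weighted two-component mixture, with a numerical check that the mixture density integrates to one over the circle. None of this uses Goal's type-safe model representations; all names and the truncated Bessel series are illustrative assumptions.

```haskell
-- Modified Bessel function of the first kind I_0, via its power
-- series I_0(x) = sum_k (x^2/4)^k / (k!)^2, truncated at 30 terms.
bessel0 :: Double -> Double
bessel0 x = sum (take 30 (go 1 1))
  where go t k = t : go (t * (x * x / 4) / (k * k)) (k + 1)

-- Von Mises density with mean direction mu and concentration kappa.
vonMises :: Double -> Double -> Double -> Double
vonMises mu kappa x = exp (kappa * cos (x - mu)) / (2 * pi * bessel0 kappa)

-- A mixture given (weight, (mu, kappa)) components.
mixture :: [(Double, (Double, Double))] -> Double -> Double
mixture comps x = sum [ w * vonMises mu kappa x | (w, (mu, kappa)) <- comps ]

-- Riemann-sum check that the mixture density integrates to ~1.
main :: IO ()
main = do
  let comps = [(0.6, (0.0, 2.0)), (0.4, (pi, 4.0))]
      n = 10000 :: Int
      dx = 2 * pi / fromIntegral n
  print (dx * sum [ mixture comps (fromIntegral i * dx) | i <- [0 .. n - 1] ])
```

Fitting the weights and component parameters (the subject of the post) is done in Goal; this sketch only pins down the density being fit.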