October 31, 2017

Silent but not Idle

It’s been quite some time since my last post. My silence hasn’t been due to idleness. Quite the opposite.

Most of my time this year has been dedicated to gaining a strong understanding of deep learning. I built a solid theoretical foundation by reading Ian Goodfellow’s wonderful book Deep Learning and watching lecture videos from Stanford’s CS231n, Convolutional Neural Networks for Visual Recognition. I rounded out my knowledge by reading various papers, watching talks on YouTube, etc. The Internet is an endless trove of resources (or are they distractions?).

To put my deep learning knowledge into practice, I invested a lot of time in learning TensorFlow. This isn’t a criticism of the other frameworks (e.g., Caffe, MXNet) but rather a personal choice based on TensorFlow’s popularity and its support for different levels of abstraction.
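To make the abstraction-level point concrete, here’s a minimal sketch written against the TensorFlow 1.x graph API that was current at the time of writing (the shapes and variable names are illustrative only): the same affine layer, first wired up from low-level ops and then expressed with the higher-level tf.layers module.

    import tensorflow as tf

    # Low-level graph API: create the variables and wire up the ops by hand.
    x = tf.placeholder(tf.float32, shape=[None, 784])  # batch of flattened images
    w = tf.Variable(tf.zeros([784, 10]))
    b = tf.Variable(tf.zeros([10]))
    logits_low = tf.matmul(x, w) + b

    # Higher-level layers API: the same affine transform in one call,
    # with the underlying variables created and tracked for you.
    logits_high = tf.layers.dense(x, units=10)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # sess.run(logits_low, feed_dict={x: batch}) would evaluate the graph.

Being able to drop down to raw ops or stay at the layer (or Estimator) level within one framework was a big part of the appeal.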

I’ve applied most of this deep learning expertise at work, taking an in-depth look at the run-time characteristics (e.g., FLOPS, memory consumption, memory bandwidth) of distributed training and inference, with and without GPU acceleration. But since this is a personal blog, I avoid talking about work stuff here.

This year, I completed Coursera’s Probabilistic Graphical Models specialization. It took a fair amount of work and time to finish, but it was well worth it: probabilistic graphical models are a powerful addition to my toolbox, and I’m eager to apply them in practice.

I also spent considerable time learning about non-parametric Bayesian techniques, in particular the time-series analysis work of Prof. Emily Fox. Real-world problems are messy, and non-parametric Bayesian techniques look like a great way to deal with some of that mess.

Along the way, I’ve started a couple of side projects and entered a few Kaggle competitions, but never reached a blog-worthy point in any of those efforts. To date, I’ve been hesitant to post tutorials and the like, as there are already so many available; I’m not sure the world needs another deep-learning-101 convolutional cat classifier. But perhaps the occasional tutorial would be a good way to keep momentum on posting and to practice the Feynman Technique.

Tags: Meta, Coursera, DeepLearning, PGMs