In June, Gabe (untapt’s machine learning engineer) and I had the pleasure of attending the International Conference on Machine Learning. This year, it was held in Times Square — conveniently on the doorstep of untapt’s headquarters in midtown Manhattan.
I was inspired by talks and panel sessions featuring many of the leading figures in computational statistics today, including:
- Yoshua Bengio (at the Université de Montréal)
- Yann LeCun (of New York University but now primarily Facebook, where he’s Director of AI Research)
- Andrew Gelman (of Columbia University)
- and David Silver (who led the AlphaGo project at Google DeepMind)
One of the lectures, by Larry Jackel (with Bengio and LeCun in the front row), focused on the transformative innovation at Bell Labs in the 1980s (featuring, guess who — LeCun!), when much of contemporary artificial neural network theory was born. It became clear to me that the collaborative community around these transformative concepts is what enables them to develop and ultimately flourish.
Galvanized by this thought, I inquired at that evening’s Open Statistical Programming Meetup if anyone would like to study and apply techniques from the explosively influential field of deep machine learning together. To my delight, there were a dozen such adventurous folk — from diverse backgrounds across statistics, computer science, and mathematics — in the audience, and that count has subsequently increased modestly via word of mouth.
Our Deep Learning Study Group has decided to cover two streams simultaneously:
- Fundamentals: We will work through academic textbooks and exercises so that we command strong theoretical foundations for neural networks and deep learning. Topics will cover calculus, linear algebra, probability, and computer science.
- Applications: We will develop hands-on experience building deep learning models. Initially, we’ll follow tutorials; then we’ll move on to solving novel and illustrative data problems involving a broad range of techniques.
We will publish the material we cover online, including by committing code to GitHub, with the aim of developing a resource for engineers and scientists who would like to thoroughly understand neural networks and deep learning from the ground up.
The first meeting of the Deep Learning Study Group is coming up on Wednesday, August 17th at 6:30pm for up to two hours. In preparation, the recommended work is:
- read the first chapter of Michael Nielsen’s e-book
- complete the exercises in that chapter
- set yourself up to use TensorFlow
- work through this beginner TensorFlow tutorial involving the MNIST data set, which is also discussed in Nielsen’s first chapter
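The model at the heart of that beginner tutorial is softmax regression: predict class probabilities via softmax(Wx + b) and fit W and b by gradient descent on the cross-entropy loss. As a taste of what's involved, here is a minimal NumPy sketch of that same model trained on tiny synthetic data (the shapes, learning rate, and step count here are illustrative choices, not taken from the tutorial):

```python
import numpy as np

# Softmax regression on toy data -- the same model the MNIST beginner
# tutorial builds in TensorFlow, sketched here in plain NumPy.
rng = np.random.default_rng(0)

n, d, k = 200, 4, 3                        # samples, features, classes (illustrative)
X = rng.normal(size=(n, d))
true_W = rng.normal(size=(d, k))
y = np.argmax(X @ true_W, axis=1)          # synthetic labels from a linear model
Y = np.eye(k)[y]                           # one-hot targets

W = np.zeros((d, k))                       # parameters to learn
b = np.zeros(k)
lr = 0.5                                   # learning rate (illustrative)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for step in range(200):
    P = softmax(X @ W + b)                 # predicted class probabilities
    grad_logits = (P - Y) / n              # gradient of mean cross-entropy w.r.t. logits
    W -= lr * (X.T @ grad_logits)          # gradient-descent updates
    b -= lr * grad_logits.sum(axis=0)

accuracy = (np.argmax(softmax(X @ W + b), axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The tutorial does the same thing with 784 pixel features and 10 digit classes, letting TensorFlow compute the gradients automatically rather than by hand as above.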
This will be hosted at untapt HQ, where we are lucky to have a neural network-obsessed CEO.
Please email me (email@example.com) if you’d like to join in the fun.