Deep Learning Study Group Session #1: Perceptrons and Sigmoid Neurons

Last week, untapt hosted the inaugural session of a study group bent on mastering Deep Learning.

It proved far more popular than we anticipated, first necessitating splitting the session over two evenings and later capping the group size.


Thank you to Ed Donner, untapt’s neural network-obsessed CEO (pictured above), who offered up our office space to the group, and who kindly provided nourishment and refreshments for hungry minds.


In addition, many thanks to the fifty-five Deep Learners who attended and contributed on both Wednesday and Thursday evenings. I learned a lot from the in-depth discussions around the textbook exercises we worked through, and am excited for our next session.


As detailed in our GitHub repository, we covered the theory of perceptrons and sigmoid neurons.
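For readers following along at home, the two neuron models from this session can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from our repository; the NAND weights and bias are the standard example from Nielsen's chapter 1.

```python
import numpy as np

def perceptron(w, b, x):
    """Perceptron: fires (outputs 1) if w.x + b > 0, else outputs 0."""
    return 1 if np.dot(w, x) + b > 0 else 0

def sigmoid_neuron(w, b, x):
    """Sigmoid neuron: smooth output in (0, 1) instead of a hard step."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

# A perceptron with weights (-2, -2) and bias 3 computes NAND,
# which is enough to build any logical circuit.
w, b = np.array([-2.0, -2.0]), 3.0
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(w, b, np.array(x)))
```

The key difference the group discussed: a small change to a perceptron's weights can flip its output completely, while a sigmoid neuron's output changes smoothly, which is what makes gradient-based learning possible.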

Using Yann LeCun’s classic MNIST data set, we also worked through straightforward tutorials to classify digits with varying degrees of accuracy.
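To give a flavour of what those tutorials involve, here is a minimal sketch of a sigmoid network trained by gradient descent on the quadratic cost, in the style of Nielsen's chapter 1. It is not the tutorial code itself; to keep it self-contained it trains on the tiny XOR problem rather than downloading MNIST, but the forward pass, backpropagation, and update steps are the same shape.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy stand-in for MNIST: the XOR problem (4 examples, 2 features)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

lr = 1.0
for epoch in range(10000):
    # Forward pass through one hidden layer of sigmoid neurons
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the quadratic cost
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))
```

Swapping the four XOR rows for 28x28 pixel vectors and ten output neurons turns this same skeleton into the MNIST digit classifiers the tutorials build.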


From a theoretical perspective, we aim to eventually complete Nielsen’s neural network textbook and Goodfellow et al.’s forthcoming Deep Learning tome, filling in the requisite mathematical, statistical, and computer science foundations where necessary. Simultaneously, we’ll be combining our broad mix of technical backgrounds to develop novel deep networks.


At present, we’d like to keep the study group size intimate and conversational. We are, however, ideating on future, alternative formats that could accommodate larger numbers. Please email me if you’re interested in the latter.


Jon is the Chief Data Scientist at untapt. Previously he worked as Data Scientist at Omnicom in New York and as an algorithmic trader in Singapore. As a Wellcome Trust Scholar, Jon obtained a doctorate in neuroscience from Oxford University. He enjoys writing on scientific and technological advances, particularly in statistics and machine learning, with an emphasis on the societal impacts of these developments.