Deep Learning Study Group Session #2: The Backpropagation Algorithm

Yesterday evening, untapt hosted the second in a series of workshops on Deep Learning.


As detailed in our study group’s GitHub repository, the theoretical focus of this session was backpropagation, the principal algorithm for propagating error backward through an artificial neural network so that its weights can be adjusted to reduce that error. In particular, econometrician Katya Vasilaky led an illuminating discussion, rich in real-world examples, covering its:

  • four fundamental equations (summarized below)
  • intuitive purpose
  • relationship with the cost function and gradient descent, including key assumptions
  • role in calculating layer-by-layer error within a feedforward network
  • various algorithmic implementations

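For reference, here is one standard formulation of those four fundamental equations, for a feedforward network with cost C, weighted inputs z^l, activations a^l = σ(z^l), and per-layer error δ^l; the notation is a common textbook convention, assumed here rather than taken from the session's own materials:

```latex
% The four fundamental equations of backpropagation (a standard formulation;
% the notation below is an assumed convention, not the session's own).
\begin{align}
  \delta^L &= \nabla_a C \odot \sigma'(z^L)
    && \text{(error at the output layer } L\text{)} \\
  \delta^l &= \bigl( (w^{l+1})^\top \delta^{l+1} \bigr) \odot \sigma'(z^l)
    && \text{(error propagated back to layer } l\text{)} \\
  \frac{\partial C}{\partial b^l_j} &= \delta^l_j
    && \text{(gradient of the cost with respect to the biases)} \\
  \frac{\partial C}{\partial w^l_{jk}} &= a^{l-1}_k \, \delta^l_j
    && \text{(gradient of the cost with respect to the weights)}
\end{align}
```

These equations rest on two key assumptions about the cost function: that it can be written as an average of per-example costs, and that it is a function of the network’s output activations. Gradient descent then uses the resulting partial derivatives to update every weight and bias, layer by layer.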

In addition, we reviewed our initial efforts at developing familiarity with the high-level deep learning library Keras, including experimenting with some of its simpler example networks for classifying MNIST digits and Reuters newswires.
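As a rough sketch of the kind of Keras example we experimented with, a minimal MNIST classifier looks something like the following; the layer sizes, optimizer, and training settings here are illustrative assumptions rather than the exact example scripts:

```python
# Minimal MNIST classifier in Keras -- an illustrative sketch, not the exact
# example network from the session (layer sizes and hyperparameters assumed).
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

# Load the data and flatten each 28x28 image into a 784-dimensional vector in [0, 1]
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784).astype('float32') / 255
x_test = x_test.reshape(10000, 784).astype('float32') / 255
y_train = to_categorical(y_train, 10)  # one-hot encode the ten digit classes
y_test = to_categorical(y_test, 10)

# A feedforward network with a single hidden layer and a softmax output
model = Sequential()
model.add(Dense(128, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

# Cross-entropy cost, minimized by stochastic gradient descent via backpropagation
model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=128, epochs=5, validation_data=(x_test, y_test))
```

The Reuters example follows the same pattern, swapping in the keras.datasets.reuters newswire data and a 46-way softmax output layer.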

For our next session, we’ll be tackling techniques and Keras-based tools for broadly improving the way neural networks learn.


Jon is the Chief Data Scientist at untapt. Previously he worked as a Data Scientist at Omnicom in New York and as an algorithmic trader in Singapore. As a Wellcome Trust Scholar, Jon obtained a doctorate in neuroscience from Oxford University. He enjoys writing on scientific and technological advances, particularly in statistics and machine learning, with an emphasis on the societal impacts of these developments.