Haileybury ML
Alexander Sherstnev ashrstnv@mit.edu
Lessons:
- Wednesday 01.14.2026 - training a simple neural network to predict housing prices
- Monday 01.19.2026 - predict human activity from sensor data, more advanced dense NN
- Wednesday 01.21.2026 - classify images using a convolutional neural network
- Friday 01.23.2026 - (almost) all the math you need to know to train a simple neural net
- Tuesday 01.27.2026 - generating text with a recurrent neural network
- Thursday 01.29.2026 - ???
Setup
Each lesson has a command you can run to download the starter code and set up your environment.
If you already have the code downloaded, cd into the folder where it is and run uv run jupyter-lab to edit it. Ask ChatGPT if you don't know how cd works or if uv isn't working.
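The workflow looks roughly like this (the folder name below is just a placeholder for wherever your lesson code lives):

```shell
# Replace lesson-code with the folder your starter code was downloaded to
cd lesson-code
# Launch JupyterLab inside the project's uv-managed environment
uv run jupyter-lab
```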
Important papers
ML researchers publish scientific papers about their discoveries. The original papers are often a really good way to understand a topic.
- Adam: A Method for Stochastic Optimization - Adam is the optimizer we use. This paper is fairly advanced, but it explains the nuances of gradient descent.
- ImageNet Classification with Deep Convolutional Neural Networks - AlexNet was a significant step forward in image recognition using CNNs.
- Attention is All You Need - Transformers are the most important discovery in recent ML, and the reason LLMs work. This might be the most famous ML paper ever.
- The Unreasonable Effectiveness of Recurrent Neural Networks - Fun to read blog post with some intuition on how recurrent neural networks (RNNs) work.
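To get a taste of what the Adam paper describes, here is a minimal, self-contained sketch of its update rule in plain Python, with no ML library. The function being minimized and all hyperparameter values are just illustrative choices, not anything from the course code:

```python
import math

def adam_minimize(grad, x, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Minimize a 1-D function given its gradient, using the Adam update rule."""
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g          # first moment: running mean of gradients
        v = beta2 * v + (1 - beta2) * g * g      # second moment: running mean of squared gradients
        m_hat = m / (1 - beta1 ** t)             # bias correction (moments start at zero)
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3
x_min = adam_minimize(lambda x: 2 * (x - 3), x=0.0)
```

Note how the step size is roughly lr regardless of how big the gradient is, because the gradient is divided by the square root of its own running average of squares; that rescaling is the main idea the paper develops.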
Detailed references
These explain topics thoroughly, and can be hard to follow at first.
- Backpropagation - how do we actually take the gradient of the loss
- (Batch) Normalization - why do we normalize
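To get a feel for what the backpropagation reference covers, here is a minimal sketch: the chain rule applied by hand to a one-neuron network, then checked against a finite-difference approximation of the gradient. All names and values are illustrative:

```python
import math

def forward(w, b, x, target):
    """Forward pass: y = sigmoid(w*x + b), loss = (y - target)^2."""
    z = w * x + b
    y = 1 / (1 + math.exp(-z))
    return (y - target) ** 2

def grads(w, b, x, target):
    """Backprop: apply the chain rule from the loss back to w and b."""
    z = w * x + b
    y = 1 / (1 + math.exp(-z))
    dL_dy = 2 * (y - target)        # derivative of the squared-error loss w.r.t. y
    dy_dz = y * (1 - y)             # derivative of sigmoid(z) w.r.t. z
    dL_dz = dL_dy * dy_dz           # chain rule: combine the two
    return dL_dz * x, dL_dz         # dL/dw and dL/db (z = w*x + b)

# Sanity check: compare against numeric finite-difference gradients
w, b, x, t = 0.5, -0.2, 1.5, 1.0
eps = 1e-6
dw_num = (forward(w + eps, b, x, t) - forward(w - eps, b, x, t)) / (2 * eps)
db_num = (forward(w, b + eps, x, t) - forward(w, b - eps, x, t)) / (2 * eps)
dw_ana, db_ana = grads(w, b, x, t)
```

The finite-difference check is a standard way to verify a hand-derived gradient: if backprop is implemented correctly, the analytic and numeric values agree to many decimal places.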