❤️ Become The AI Epiphany Patreon ❤️
[ Link ]
👨👩👧👦 Join our Discord community 👨👩👧👦
[ Link ]
Watch me code a neural network from scratch in this 3rd video of the JAX tutorial series! 🥳
In this video, I create an MLP (multi-layer perceptron) and train it as a classifier on MNIST (although it's trivial to use a more complex dataset) - all this in pure JAX (no Flax/Haiku/Optax).
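A minimal sketch of what such a pure-JAX MLP pipeline can look like. The function names (`init_mlp`, `predict`, `update`), layer sizes, and learning rate here are my own illustrative choices, not necessarily identical to the video's code:

```python
import jax
import jax.numpy as jnp

def init_mlp(layer_sizes, key, scale=0.1):
    """Initialize a (weights, biases) pair per layer with Gaussian noise."""
    params = []
    keys = jax.random.split(key, len(layer_sizes) - 1)
    for in_dim, out_dim, k in zip(layer_sizes[:-1], layer_sizes[1:], keys):
        w_key, b_key = jax.random.split(k)
        params.append((scale * jax.random.normal(w_key, (out_dim, in_dim)),
                       scale * jax.random.normal(b_key, (out_dim,))))
    return params

def predict(params, x):
    """Forward pass for one flattened image; returns log-probabilities."""
    for w, b in params[:-1]:
        x = jax.nn.relu(w @ x + b)
    w, b = params[-1]
    logits = w @ x + b
    return logits - jax.scipy.special.logsumexp(logits)  # log-softmax

# vmap turns the single-example predict into a batched one
batched_predict = jax.vmap(predict, in_axes=(None, 0))

def loss_fn(params, images, labels):
    """Mean negative log-likelihood over a batch (labels are one-hot)."""
    log_probs = batched_predict(params, images)
    return -jnp.mean(jnp.sum(log_probs * labels, axis=1))

@jax.jit
def update(params, images, labels, lr=0.01):
    """One step of plain SGD, no Optax needed."""
    grads = jax.grad(loss_fn)(params, images, labels)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
```

The key JAX idioms are here: write `predict` for a single example, get batching for free with `jax.vmap`, and get the training step via `jax.grad` + `jax.jit`.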
I then add cool visualizations such as:
* Visualizing the MLP's learned weights
* Visualizing embeddings of a batch of images with t-SNE
* Analyzing dead neurons
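The dead-neuron analysis can be sketched as follows: a ReLU neuron is considered "dead" if it outputs zero for every input in a probe batch, so it receives no gradient. This is a hypothetical helper under the assumption that `params` is a list of `(weights, biases)` pairs as in the MLP above; the video's exact implementation may differ:

```python
import jax
import jax.numpy as jnp

def dead_neurons_per_layer(params, batch):
    """Count ReLU neurons that output exactly zero for every input in `batch`.

    params: list of (w, b) pairs; batch: array of shape (batch_size, in_features).
    Returns a list with one dead-neuron count per hidden layer.
    """
    x = batch
    counts = []
    for w, b in params[:-1]:          # hidden layers only; output layer has no ReLU
        x = jax.nn.relu(x @ w.T + b)  # shape (batch_size, out_features)
        dead = jnp.all(x == 0.0, axis=0)  # True where a neuron never fired
        counts.append(int(dead.sum()))
    return counts
```

A larger probe batch gives a more reliable estimate, since a neuron that happens to be silent on a few inputs may still fire elsewhere.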
Credit:
Inspired by the official advanced JAX tutorial: [ Link ]
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
✅ Get started with JAX on GitHub: [ Link ]
✅ Dead neuron article: [ Link ]
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
⌚️ Timetable:
00:00:00 Intro, structuring the code
00:03:10 MLP initialization function
00:13:30 Prediction function
00:24:10 PyTorch MNIST dataset
00:31:40 PyTorch data loaders
00:39:55 Training loop
00:49:15 Adding the accuracy metric
01:01:45 Visualize the image and prediction
01:04:40 Small code refactoring
01:09:25 Visualizing MLP weights
01:11:30 Visualizing embeddings using t-SNE
01:17:55 Analyzing dead neurons
01:24:35 Outro
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💰 BECOME A PATRON OF THE AI EPIPHANY ❤️
If these videos, GitHub projects, and blogs help you,
consider supporting me on Patreon!
The AI Epiphany - [ Link ]
One-time donation - [ Link ]
Huge thank you to these AI Epiphany patrons:
Eli Mahler
Petar Veličković
Bartłomiej Danek
Zvonimir Sabljic
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
💼 LinkedIn - [ Link ]
🐦 Twitter - [ Link ]
👨👩👧👦 Discord - [ Link ]
📺 YouTube - [ Link ]
📚 Medium - [ Link ]
💻 GitHub - [ Link ]
📢 AI Newsletter - [ Link ]
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#jax #neuralnetwork #coding