30 Days, 30 Deep Learning Projects
After the amazing success of my 30 Days, 30 Machine Learning Projects Challenge, I’m excited to take on a new one!
While completing the ML challenge, I found myself fascinated but occasionally overwhelmed by Deep Learning concepts. That’s when I knew I had to create a new challenge: 30 Days, 30 Deep Learning Projects.
This challenge is designed for gradual learning—starting with the basics of neural networks and ending with more advanced topics like GANs, Transformers, and BERT. I’ve carefully curated a list of projects, ensuring that the complexity builds up week by week. It’s time to dive deep!
Week 1: Neural Networks Fundamentals
Week | Day | Project | Dataset Source |
---|---|---|---|
1 | 1 | Predict house prices using a feedforward neural network (NN); see the sketch after this table | Boston Housing Prices |
1 | 2 | Classify handwritten digits using a simple NN on MNIST | MNIST Dataset |
1 | 3 | Explore backpropagation theory and tweak learning rates (MNIST) | MNIST Dataset |
1 | 4 | Compare activation functions (ReLU, Sigmoid, Tanh) on Fashion MNIST | Fashion MNIST |
1 | 5 | Experiment with optimizers (SGD, Adam, RMSprop) using a pre-built CNN on CIFAR-10 | CIFAR-10 Dataset |
1 | 6 | Apply dropout and L2 regularization to control overfitting (Titanic Dataset) | Titanic Dataset |
1 | 7 | Fine-tune hyperparameters with Keras Tuner on a small NN | Telco Customer Churn Dataset |
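To make Day 1 concrete, here’s a minimal sketch of a feedforward regression network in Keras. It assumes the Boston Housing split that ships with `keras.datasets`; the layer sizes and epoch count are placeholder choices, not tuned values.

```python
from tensorflow import keras

# Boston Housing regression data bundled with Keras (13 numeric features per house)
(x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data()

# Standardize features using training-set statistics only
mean, std = x_train.mean(axis=0), x_train.std(axis=0)
x_train, x_test = (x_train - mean) / std, (x_test - mean) / std

# Small feedforward network: two hidden layers, one linear output for the price
model = keras.Sequential([
    keras.Input(shape=(x_train.shape[1],)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(x_train, y_train, epochs=100, validation_split=0.2, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))  # [test MSE, test MAE]
```

Day 4’s activation comparison and Day 6’s dropout/L2 experiments only change a line or two of this skeleton, which is exactly why it makes a good baseline for the week.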
Week 2: CNNs and Computer Vision
Week | Day | Project | Dataset Source |
---|---|---|---|
2 | 8 | Build a simple CNN for CIFAR-10 image classification; see the sketch after this table | CIFAR-10 Dataset |
2 | 9 | Modify CNN with pooling layers and visualize filters | CIFAR-100 Dataset |
2 | 10 | Use pre-built data augmentation methods in Keras on Fashion MNIST | Fashion MNIST Dataset |
2 | 11 | Apply Transfer Learning with VGG16 for a simple classification task | Cats vs Dogs Dataset |
2 | 12 | Implement YOLO for object detection (tutorial-based approach to simplify) | Tutorial: YOLOv3 |
2 | 13 | Explore image segmentation with U-Net for a small portion of Carvana dataset | Carvana Image Masking Dataset |
2 | 14 | Mini-Project: build a custom CNN-based student model using a pre-trained teacher model (knowledge distillation) | Use Kaggle Datasets |
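For Day 8, a simple CNN for CIFAR-10 could be sketched roughly as below; the two conv/pool blocks and the epoch count are starting points to iterate on, not tuned settings.

```python
from tensorflow import keras

# CIFAR-10: 60,000 32x32 colour images in 10 classes
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = keras.Sequential([
    keras.Input(shape=(32, 32, 3)),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(64, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10),  # logits for the 10 classes
])
model.compile(optimizer="adam",
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```

Day 9’s pooling experiments and Day 10’s augmentation layers (e.g., `RandomFlip`, `RandomRotation`) slot straight into this model definition.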
Week 3: RNNs, LSTMs, and Time Series
Week | Day | Project | Dataset Source |
---|---|---|---|
3 | 15 | Prepare a simple time series dataset (Jena Climate or stock data) for RNN model | Jena Climate Dataset |
3 | 16 | Build a basic RNN model for sequence prediction (temperature forecasting) | Jena Climate Dataset |
3 | 17 | Build an LSTM model for sentiment analysis (IMDb Dataset); see the sketch after this table | IMDb Reviews Dataset |
3 | 18 | Add an attention mechanism to the LSTM model for machine translation (split: theory on Day 18, code on Day 19) | English-French Dataset |
3 | 19 | Continue attention mechanism (implement and test it) | English-French Dataset |
3 | 20 | Build an autoencoder-based anomaly detection system (part 1: data and model setup) | Network Traffic Anomaly Dataset |
3 | 21 | Fine-tune and evaluate autoencoder model for anomaly detection | Network Traffic Anomaly Dataset |
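For Day 17, here’s a rough sketch of an LSTM sentiment classifier using the pre-tokenized IMDb reviews bundled with Keras; the vocabulary size, sequence length, and layer sizes are arbitrary starting values.

```python
from tensorflow import keras

vocab_size, max_len = 10_000, 200  # arbitrary starting points

# IMDb reviews arrive pre-tokenized as lists of integer word indices
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.utils.pad_sequences(x_train, maxlen=max_len)
x_test = keras.utils.pad_sequences(x_test, maxlen=max_len)

model = keras.Sequential([
    keras.Input(shape=(max_len,), dtype="int32"),
    keras.layers.Embedding(vocab_size, 32),       # learn 32-d word vectors
    keras.layers.LSTM(64),                        # summarize each review into one vector
    keras.layers.Dense(1, activation="sigmoid"),  # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.2)
print(model.evaluate(x_test, y_test))
```

Swapping the `LSTM` layer for `SimpleRNN` and the sigmoid output for a linear one gives a reasonable starting point for the Day 16 temperature-forecasting model.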
Week 4: GANs, Transformers, and Advanced Topics
Week | Day | Project | Dataset Source |
---|---|---|---|
4 | 22 | GAN Basics: Understand the GAN architecture and set up the framework (on MNIST or Fashion MNIST); see the sketch after this table | MNIST Dataset |
4 | 23 | Train and evaluate the GAN (continue from Day 22) | Fashion MNIST Dataset |
4 | 24 | Build a Conditional GAN (CGAN) for generating specific images (Fashion MNIST) | Fashion MNIST Dataset |
4 | 25 | Implement CycleGAN for style transfer (e.g., horse to zebra conversion) | CycleGAN Dataset |
4 | 26 | Train the CycleGAN on a smaller image set (like horse2zebra) | CycleGAN Dataset |
4 | 27 | Build a simple transformer-based model (BERT) for text classification (IMDb Dataset) | IMDb Movie Reviews |
4 | 28 | Fine-tune the BERT model on a custom NLP task | IMDb Movie Reviews |
4 | 29 | Work on a SimCLR self-supervised learning project or a GPT-based text generation project (split into two parts) | SimCLR Tutorial: SimCLR Paper, GPT-2: Hugging Face |
4 | 30 | Final Capstone: Finish any ongoing project or combine techniques for a final challenge (GANs, NLP) | Explore Kaggle Competitions or Real-world Challenges |
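For Days 22–23, a bare-bones GAN training loop on MNIST could look roughly like the sketch below. Fully connected generator and discriminator networks keep the example short; a serious run would use convolutional layers, more careful hyperparameters, and far more epochs.

```python
import tensorflow as tf
from tensorflow import keras

latent_dim = 100  # size of the noise vector fed to the generator

# Generator: noise vector -> 28x28 "fake" digit
generator = keras.Sequential([
    keras.Input(shape=(latent_dim,)),
    keras.layers.Dense(256, activation="relu"),
    keras.layers.Dense(28 * 28, activation="sigmoid"),
    keras.layers.Reshape((28, 28, 1)),
])

# Discriminator: image -> probability that it is a real MNIST digit
discriminator = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Flatten(),
    keras.layers.Dense(256, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])

g_opt, d_opt = keras.optimizers.Adam(1e-4), keras.optimizers.Adam(1e-4)
bce = keras.losses.BinaryCrossentropy()

@tf.function
def train_step(real_images):
    batch_size = tf.shape(real_images)[0]
    noise = tf.random.normal([batch_size, latent_dim])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_pred = discriminator(real_images, training=True)
        fake_pred = discriminator(fake_images, training=True)
        # Discriminator wants real -> 1 and fake -> 0; generator wants fake -> 1
        d_loss = bce(tf.ones_like(real_pred), real_pred) + bce(tf.zeros_like(fake_pred), fake_pred)
        g_loss = bce(tf.ones_like(fake_pred), fake_pred)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    return d_loss, g_loss

# Train on MNIST scaled to [0, 1]
(x_train, _), _ = keras.datasets.mnist.load_data()
x_train = (x_train[..., None] / 255.0).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices(x_train).shuffle(60_000).batch(128)
for epoch in range(5):  # a real run needs far more epochs
    for batch in dataset:
        d_loss, g_loss = train_step(batch)
    print(f"epoch {epoch + 1}: d_loss={float(d_loss):.3f}, g_loss={float(g_loss):.3f}")
```

Day 24’s conditional GAN follows the same loop; the main change is concatenating a class label (e.g., a one-hot vector) to both the generator’s noise input and the discriminator’s input.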
Ready to Dive In?
I’m planning to start this Deep Learning challenge on 1st November, so I’ll take the time before then to brush up on the theory. I’ll also share additional resources and a reading list by 19th October (I have added them; check out the Resource list). If you’re ready to dive into deep learning, stay tuned, and feel free to join me on this journey!
Let’s master Deep Learning one project at a time! 🚀
Pro Tip: Start small, stay consistent, and before you know it, you’ll be building complex models like a pro!