

The Deep Learning Specialisation is a series of five courses by Prof. Andrew Ng (Professor at Stanford University, Co-Founder of Coursera) that attempts to democratise neural nets.
Towards the last quarter of 2020 (which felt longer than it was) I set out to complete a more rigorous course on Neural Nets. I wandered around the internet and libraries to figure out what would stick. In this blog post I try to summarise the different phases of that learning journey.
Call to adventure
I have been learning about the potential of AI to disrupt the Law and Justice space for some time now. Since Law is very status-quo driven and generally change-resistant, the thought that one could fundamentally challenge the way it works seemed really cool. This gave me the impetus to learn Neural Nets and Deep Learning, and I jumped straight into it.
Crossing the threshold
I am well acquainted with Python and OOP, but because programming has not been my primary work for many years my skills felt a little rusty. Beyond that, I was unsure about three choices: the primary library (or libraries), the MOOC, and the infrastructure available for coding. Below are the choices and the order I took them in:
- Libraries: Pure Python first, then both PyTorch and TensorFlow (in that order). Pure Python gave me a lot of intuition into how a neural net would work on more understandable problems like logistic regression (see the first sketch after this list). It was also my choice because the Coursera course helps you build these things in an organised manner. In real work you will face both PyTorch and TensorFlow environments, so you might as well learn both, and familiarity with one will help with the other. (Dr. Jon Krohn on PyTorch vs TensorFlow: https://www.youtube.com/watch?v=Be5QwA-yDJE)
- MOOC: Coursera Deep Learning specialisation, Deep Learning Illustrated and Fast.ai (in no specific order). I ended up paying for the Coursera DL specialisation, which was around Rs. 3500 per month until you finish all the courses. I really liked the way Prof. Andrew Ng combines the mathematics with the programming to give a wholesome flavour of the subject. I started with Fast.ai and, true to its objectives, it helps you get started blazingly fast (~4 lines of code for an image classifier; see the second sketch after this list), but I felt I still needed some grounding on the algebra side of things. Later I realised that this is something Jeremy covers in the second part of the Fast.ai MOOC. (Notes: (1) If you are thinking of doing the Fast.ai MOOC, I highly recommend watching the first videos of both part 1 and part 2 to get a broad sense of what's where. (2) nbdev is mind-boggling, and even if you don't continue with Fast.ai to learn DL, do look for Jeremy's nbdev tutorial.)
- Infrastructure: (Don't worry about this if Coursera is your plan, but do read on.) There are several choices for infrastructure, ranging from Google Colab to AWS to Paperspace (read the first section). Depending on where you are on the learning curve, you can choose one over the other. Coursera gives you access to a Jupyter Notebook environment within its lab ecosystem. You should try Google Colab first to reduce cost, and then move to AWS if you are building a product or service. I did not quite enjoy the Paperspace experience that Fast.ai recommends, especially since the Fast.ai book is available on Colab and AWS as well.
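To give a flavour of that pure-Python intuition, here is a minimal from-scratch logistic regression trained with gradient descent. The tiny dataset and learning rate are made up for illustration; the course builds essentially this, one piece at a time.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative toy data: 3 examples with 2 features each.
X = [[0.5, 1.2], [1.0, -0.7], [-1.5, 0.3]]
y = [1, 0, 0]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(100):                          # gradient-descent steps
    dw, db = [0.0, 0.0], 0.0
    for xi, yi in zip(X, y):
        a = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        for j in range(len(w)):               # gradient of the log-loss: (a - y) * x
            dw[j] += (a - yi) * xi[j]
        db += a - yi
    w = [wj - lr * dwj / len(X) for wj, dwj in zip(w, dw)]
    b -= lr * db / len(X)

print(w, b)  # weights nudge towards separating class 1 from class 0
```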
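And for contrast, the "blazing fast" start: roughly the cat classifier from the first Fast.ai lesson. Treat this as a sketch, since exact function names vary between fastai versions.

```python
from fastai.vision.all import *

# Oxford-IIIT Pets: cat breeds have filenames starting with an uppercase letter.
path = untar_data(URLs.PETS)/'images'

def is_cat(f):
    return f.name[0].isupper()

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2,
    label_func=is_cat, item_tfms=Resize(224))

# Fine-tune a pretrained ResNet for one epoch.
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)
```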
The road of trials
Phew! That was a lot of stuff. I must have watched 10–15 different tutorials on YouTube and was super focused on Fast.ai, but felt that some linear algebra would be good. I think that also comes from how I have traditionally been taught some of these subjects. The Coursera Deep Learning specialisation, as Prof. Andrew Ng says, is one of the most efficient ways to learn the subject. I highly recommend this video for motivation, inspiration or just to listen to a good AI podcast.
Abyss
I finally paid for the DL specialisation on Coursera and the tough road started. I began with high intensity and finished 2 weeks' worth of content in 3 days of really focused study. What really helped was keeping notes of what I was learning and trying to teach it back to myself. It also helped to reproduce some of the equations and do some mental math on the matrix sizes; the sketch below shows the kind of bookkeeping I mean. I used this tool to visualise matrix multiplication. But then I stopped.
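A minimal example of that shape bookkeeping, with layer sizes made up for illustration: each layer's weight matrix has shape (units in this layer, units feeding in), and examples are stacked as columns.

```python
import numpy as np

m = 5            # training examples (illustrative)
n_x, n_h = 4, 3  # input features and hidden units (illustrative)

X = np.random.randn(n_x, m)     # examples stacked as columns: (n_x, m)
W1 = np.random.randn(n_h, n_x)  # (units in this layer, units feeding in)
b1 = np.zeros((n_h, 1))         # one bias per hidden unit

Z1 = W1 @ X + b1                # (n_h, n_x) @ (n_x, m) -> (n_h, m); b1 broadcasts
A1 = np.tanh(Z1)                # activation preserves the shape

print(Z1.shape, A1.shape)       # (3, 5) (3, 5)
```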
It was tough to maintain the momentum, as the course has a strong mathematical bent from the very beginning. I felt as if I was going too deep, and I like going a little wide to get a sense of the whole space. I searched the internet for something that balanced depth with breadth. For the next week or two I just did not have the strength to move on.
Metamorphosis
One of the things I have started doing over the past few years is reading books on the topics I am trying to learn. While this may seem a rather elementary idea to many, I have been amazed at what and how much can be learned if you bring it into practice (you also need dedicated time). I was trying to set up my computer at the same time as I was trying to finish my Coursera course, and that is when I encountered Dr. Jon Krohn's talk on PyTorch vs TensorFlow. I really liked the talk and ended up buying Deep Learning Illustrated, and it has been amazing. Rather than trying to find the perfect resource, I started using two different ones: from Coursera I got a good grasp of the subject matter, and from the book I got the language to express my ideas in DL. It got exciting after that, as I supplemented the rigour of the specialisation course with the lucidity of the book.
The ultimate boon
The last 2 weeks were very exciting, as I could now link multiple ideas in my head and they made sense. The biggest realisation for me was moving from an explicit for-loop to a vectorised implementation for ANNs and DL generally, as it gave me a lot of intuition into common debugging issues like mismatched matrix sizes (a sketch of the difference follows below). I kind of breezed through the next couple of weeks and finished the course.
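A minimal sketch of that for-loop-to-vectorised move, using logistic regression's forward pass over m examples (sizes are made up for illustration):

```python
import numpy as np

n_x, m = 3, 1000                      # features, examples (illustrative)
rng = np.random.default_rng(0)
X = rng.standard_normal((n_x, m))     # examples stacked as columns
w = rng.standard_normal((n_x, 1))
b = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Explicit for-loop: one example at a time.
a_loop = np.zeros((1, m))
for i in range(m):
    z_i = (w.T @ X[:, i:i + 1]).item() + b
    a_loop[0, i] = sigmoid(z_i)

# Vectorised: all m examples in one matrix product.
A = sigmoid(w.T @ X + b)              # (1, n_x) @ (n_x, m) -> (1, m)

assert np.allclose(a_loop, A)         # same result, far fewer Python-level steps
```

The two give identical numbers, but the vectorised line is dramatically faster because NumPy pushes the loop into optimised native code, and it forces you to reason about matrix shapes up front.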
Returning the boon
Okay, so now what? Can I build a chatbot? No. Can I write a DL program to distinguish between cats and dogs? Yes. Do I have enough skills to get a job? Probably not. I do not think I am skilled enough to program all of these things from scratch, but hardly anyone does that. I could use existing tools and hack together a program that works well enough, and having the fundamentals clear will help me navigate the problems I will encounter in the future. I am moving on to the next course in the series and I hope it turns out to be just as exciting a journey as this one was.