
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (62,825 ratings)

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

By the end, you will know best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that helps you understand the capabilities, challenges, and consequences of deep learning and prepares you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step into the world of AI.
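The optimization algorithms the description lists can be illustrated with a minimal NumPy sketch of an Adam update, which combines Momentum's first-moment estimate with RMSprop's second-moment estimate. This is an illustrative sketch with hypothetical variable names, not code from the course assignments:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum + RMSprop with bias correction."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSprop)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# minimize f(w) = w^2, whose gradient is 2w, starting from w = 5
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 3001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
print(f"w after 3000 Adam steps: {w:.4f}")  # close to the minimum at 0
```

Checking convergence, as the description mentions, would here amount to verifying that |w| keeps shrinking toward the minimizer.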

Top reviews

XG

Oct 30, 2017

Thank you Andrew!! I have now started to use TensorFlow; however, this tool is not well suited for research. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.

JS

Apr 4, 2021

Fantastic course: although it guides you through everything (and may feel less challenging to some), it provides all the building blocks for you to later apply to your own interesting project.

Filter by:

7,026–7,050 of 7,216 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Navaneethan S

•

Sep 20, 2017

This course was much less rigorous and theoretically-grounded than the first. There didn't seem to be much justification for any of the techniques presented, which was a stark contrast to the first course.

However, the topics are important and useful to know, so I'm glad they were covered. To me, the most useful sections were on softmax regression and deep learning frameworks, which I really enjoyed. The TensorFlow assignment was also interesting and (relative to the others) challenging.

I think there is a lot of scope for this course to be improved, and I hope Dr. Ng and his team will do so in the near future.

By André L

•

Feb 14, 2023

The content is didactic, as are the explanations; a reasonable course for a real beginner. However, the material is some of the worst I have seen: many errors that are flagged with notes between the classes, and MANY annotations and sketches from Andrew on the slides. It mixes handwritten annotations with digital text, a complete mess. I had to edit the PDF to make something useful, even though a lot of information is either missing or floating somewhere in the slide. Besides that, some videos are not edited properly: you can experience many repetitions of the same phrase.

By Stefano M

•

Apr 8, 2020

(+) On the plus side: Andrew is always an excellent lecturer. Also, the Python notebooks provided for the assignments are extremely good guidance for structuring a deep neural network project.

(-) On the minus side: this course is rather disappointing compared to Andrew's well-known Machine Learning course on coursera. There is basically no challenge, as assignments (or, I would call them, "tutorials") are *very* guided: they can be completed even with a very shallow understanding of the content. Also, lectures are quite repetitive, and more like a practical cookbook than an actual course.

By Peter G

•

Dec 5, 2017

Nice course, but again, the main emphasis is on the practical side, with a 'never mind, you don't need to know the details' approach. Optional parts covering the theory behind the batch-normalization implementation and the derivation of the softmax derivative would be very desirable. Another not-so-great thing is that the final TensorFlow practice exercises are too 'quick', in the sense that 99% of the code is written for you and hints are given in such a way that you literally don't have to use half of your brain. It is also frustrating when everything is already done for you.

By Minglei X

•

Oct 22, 2017

Some processes that were discussed in detail in previous courses are mostly omitted in the new context. While that is sometimes nice for saving time and focusing on new ideas, I feel there are sometimes subtleties in them. For example, I could not imagine how backward propagation should be implemented for batch norm. I'm not sure whether there really are subtleties that you think are too tedious and unnecessary to introduce in a short video. If that is the case, I still hope you could provide more detailed information about them somewhere, just for curious people like me.
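For the curious, backprop through batch norm is indeed subtler than through a plain layer, because the batch mean and variance depend on every example. Below is a hedged NumPy sketch of the training-time forward and backward passes; it omits the exponentially weighted running averages used at test time and is not the course's own assignment code:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale and shift."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    out = gamma * x_hat + beta
    return out, (x_hat, var, gamma, eps)

def batchnorm_backward(dout, cache):
    """Backward pass; extra terms arise because mu and var depend on x."""
    x_hat, var, gamma, eps = cache
    dgamma = (dout * x_hat).sum(axis=0)   # gradient of the scale parameter
    dbeta = dout.sum(axis=0)              # gradient of the shift parameter
    dxhat = dout * gamma
    dx = (dxhat - dxhat.mean(axis=0)
          - x_hat * (dxhat * x_hat).mean(axis=0)) / np.sqrt(var + eps)
    return dx, dgamma, dbeta

# quick demo on a toy batch of 6 examples with 3 features
x = np.random.randn(6, 3)
out, cache = batchnorm_forward(x, np.ones(3), np.zeros(3))
dx, dgamma, dbeta = batchnorm_backward(np.random.randn(6, 3), cache)
```

The subtlety the reviewer suspects is real: the two mean-subtraction terms in `dx` exist precisely because mu and var are themselves functions of x, so a naive chain rule through x_hat alone would give the wrong gradient.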

By Ashvin L

•

Aug 24, 2018

The course builds on the first course and provides some ideas on how to tune networks to perform better. However, at the core, I find the number of parameters overwhelming, and it appears that by changing the parameters we can get any answer we want. There is no "formal" mathematical basis for changing the parameters. This is a bit disconcerting.

The assignments were trivial. More importantly, at least one assignment appeared to indicate that the results are entirely dependent on the weights chosen (at random) in the first iteration. This should not be the case.

By Foad O

•

Nov 2, 2021

The course is pretty good overall. However, the programming assignments need much improvement. I realize that teaching Python syntax and programming is not really part of this course, but if students are expected to code, there need to be more detailed lessons/sections covering the basics. While providing vague, inconsistent, riddle-like "hints" in the middle of the programming exercises makes for some interesting brain exercises, it is certainly not helpful at teaching students what they need to know in order to write correct code.

By Vikash C

•

Jan 28, 2019

Content was good.

But the system that checks our submitted code marks it wrong even when it is written correctly.

In the Week 2 assignment, when I submitted the code, it flagged many functions as wrongly coded.

I resubmitted the code after a few changes, for instance changing a += 2 to a = a + 2 and string literals like 'W' to "W". It then worked fine and gave 100 points.

In short, what I observed is that the code-checking system treats a += 2 and a = a + 2 differently, and likewise 'W' and "W", even though they produce identical output.
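For the record, the reviewer is right that these constructs are equivalent in Python, so the grader differences must have come from something else (output formatting, whitespace, or stale notebook state are common suspects, though that is speculation):

```python
a = 5
a += 2           # augmented assignment
b = 5
b = b + 2        # plain assignment
print(a == b)    # True: for ints, a += 2 and a = a + 2 give the same value

s1 = 'W'
s2 = "W"
print(s1 == s2)  # True: quote style does not change the resulting str
```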

By W B K

•

Oct 1, 2018

I thought the content was well-chosen and typically presented clearly. However, unlike the previous course in this specialization, the assignments had an egregious number of typos and missing information. I found these errors confusing and time-consuming.

From the staff's forum activity, it looks like they are no longer actively involved in this course. I hope that Coursera will hire someone—an intern would probably be plenty capable—to take this course and carefully fix as many of the errors in it as she or he can find.

By Zbynek B

•

Jun 9, 2020

This is my third course by Prof. Ng, all of which I have passed with a 100% score. So far, I have always given 5 stars. This time, however, only three, because of (1) the weak explanation (intuition) of the dropout method and (2) the missing gradient for the extra gamma parameter (batch norm). It isn't a big deal for the student to derive that gradient; however, I expected Andrew at least to mention it for the backpropagation step.
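For reference, the gradients the reviewer says go unmentioned are short to state. With $\hat{x}^{(i)}$ the normalized activation and $y^{(i)} = \gamma\,\hat{x}^{(i)} + \beta$, summing over a mini-batch of size $m$ (this is the standard batch-norm derivation, not taken from the course materials):

```latex
\frac{\partial \mathcal{L}}{\partial \gamma}
  = \sum_{i=1}^{m} \frac{\partial \mathcal{L}}{\partial y^{(i)}}\, \hat{x}^{(i)},
\qquad
\frac{\partial \mathcal{L}}{\partial \beta}
  = \sum_{i=1}^{m} \frac{\partial \mathcal{L}}{\partial y^{(i)}}
```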

All in all, I love Prof. Ng's teaching style and fully recommend his courses.

By Kristof B

•

Apr 8, 2021

While I like the theoretical part of the course, the programming assignments need a lot of work. Foremost, there is the issue of TensorFlow 1 being used, and not even the latest version of TensorFlow 1, but a very old one at that. Aside from that, the assignments use too much hand-holding; I find myself deliberately scrolling past information blocks so that I actually have some work to do. Otherwise it would just be copy-pasting, or in other words, a waste of time.

By Robert M

•

Jan 12, 2022

I enjoyed the lectures by Dr. Ng. They are very clear and well explained, and I feel I have a good theoretical understanding of the concepts. The practical side is quite different. The exercises lack explanations, especially the TensorFlow ones. You write a few lines of code and "congratulations, you have written your own NN!" while they seemingly randomly transform and transpose your data without explanation. You hardly leave the course feeling like an expert.

By Egnatious P

•

Apr 19, 2020

This was an interesting course in that it taught me a lot about hyperparameter tuning and how to improve my models in general. My main issue was that the optimization assignment couldn't open properly due to Jupyter notebook issues, and I didn't receive any support or direction on the issue. I just stumbled on the solution myself, and this significantly messed up my timeline. I wish there were more support for technical issues as well.

By Dimitrios G

•

Nov 28, 2017

The course continues on the same path the previous Deep Learning course set, but I found the use of TensorFlow somewhat limiting. It is a great tool that simplifies the training and running of NNs, but it does not allow for easy debugging or for easily looking inside the built-in functions to spot problems. I felt that we were treating many tf functions as black boxes, and I am not so fond of this. Otherwise the course was fairly useful.

By George S

•

Jun 27, 2022

Please... Andrew is awesome, deeplearning.ai is awesome, DLS is also awesome. BUT why tack on that last programming assignment about TensorFlow? It ruined the whole course... What did I learn from that assignment? Nothing! Absolutely nothing... lots of crammed coding, constantly getting answers from the forums, and now I've passed the grading and remember absolutely nothing... the TensorFlow commands are like Greek to me...

By Hamad U R Q

•

Sep 12, 2019

Too easy.

One thing I want to say about Week 3: I had some confusion in the lectures but was hopeful that while going through the assignment I would clear up the concepts of tuning hyperparameters. Instead, the assignment was ALL about TensorFlow basics and nothing about tuning hyperparameters. I was really disappointed with that!

Other than that, the course contents are great and worth the time and effort.

By Fermin B

•

Mar 7, 2021

The course is very good, but the reason I didn't give it more stars is that it was difficult. I had the impression that the course was going too fast, and I wasn't able to fully understand all the content the teacher covered. I think the assignments should be more like those in the first course, where you go step by step, understanding everything about the code. More explanations about TensorFlow would be appreciated.

By Younes A

•

Dec 7, 2017

I wouldn't recommend it because of the very low quality of the assignments, but I don't regret taking it because the content is great. Seriously, the quality of the deeplearning.ai courses is the lowest I have ever seen: glitches in videos, wrong assignments (both notebooks and MCQs), and no valuable discussions on the forums. Too bad Prof. Ng couldn't get a competent team to curate his content for him.

By Christian M

•

May 15, 2022

The theoretical part was clearly understandable but the programming assignment was very poor in my opinion.

Did I miss the introduction to TensorFlow somewhere? I could not find it in the course. It was possible to solve the assignments by guessing and reading some forum posts, but honestly I did not understand very much...

I'm a bit disappointed about the introduction to TensorFlow.

By Gadiel S

•

Sep 21, 2018

The course is good. It covers important ideas, and they are well explained in the videos. However, the formulation of the assignments is sloppy. There are mistakes and inconsistencies, in some cases necessary explanations are missing, and in some cases the instructions are misleading (I suspect the assignment has changed over time, but the instructions have not been consistently updated).

By Ha S C

•

Oct 28, 2018

A much sloppier and poorer course than its predecessor. Grading mishaps (the fault of the grader), a few errors in the lectures (the variance in the normalization), and very basic, unhelpful feedback from staff made for a course that did not live up to the level of the previous one. If at any point you need further help, it is generally unavailable, or difficult to find at best.

By Ashkan R

•

Dec 23, 2020

I really like the course material, the topics discussed, and neural networks in general. I also have a lot of respect and gratitude toward Andrew, but the way he organized the quizzes and programming assignments is rather a monkey-see-monkey-do strategy. You rarely get challenged. Overall the course is worth taking, but I would not recommend it to more advanced practitioners.

By Siddharth D

•

Apr 24, 2020

I have written this before in the discussions: I feel there should be assignments to implement everything from scratch. I can fill in the code and understand most of the mathematical functions and the reasoning, but I am still not confident that I can "CODE" a new problem from scratch. I was really hoping this certification would give me the practice to achieve this.

By Maysa M G d M

•

Mar 4, 2018

Some exercises were wrong, like Z3 in the TensorFlow model: you wrote Z3 = W3*Z2 + b3, but it should have been A2, not Z2.

Several exercises did not check the result of each function, so when I arrived at the huge model function, it was hard to discover where I had gone wrong.

I think this third week could be split into two. I missed exercises on normalization; they were all about TensorFlow.
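For what it's worth, the fix the last reviewer describes looks like this in a minimal NumPy sketch. The names follow the assignments' convention of Z for pre-activations and A for activations, but the shapes and values below are illustrative, not the actual assignment code:

```python
import numpy as np

rng = np.random.default_rng(0)
W3 = rng.normal(size=(1, 4))   # layer-3 weights
b3 = np.zeros((1, 1))          # layer-3 bias
Z2 = rng.normal(size=(4, 5))   # layer-2 pre-activation

A2 = np.maximum(0, Z2)         # layer-2 activation (ReLU)
Z3 = np.dot(W3, A2) + b3       # correct: feed the activation A2, not Z2
print(Z3.shape)                # (1, 5)
```

Feeding Z2 instead of A2 would still run (the shapes match), which is exactly why the error is easy to miss and why per-function result checks would have helped.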