What you'll learn
Practical aspects of Deep Learning
Hyperparameter tuning, Batch Normalization and Programming Frameworks
This course will teach you the “magic” of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow.
After 3 weeks, you will:
– Understand industry best practices for building deep learning applications.
– Be able to effectively use common neural network “tricks”, including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
– Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence.
– Understand new best practices for the deep learning era: how to set up train/dev/test sets and analyze bias/variance.
– Be able to implement a neural network in TensorFlow.
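To give a flavor of the optimization material, here is a minimal sketch of the Adam update rule applied to a one-dimensional function. The function `adam_minimize`, the toy objective, and all hyperparameter values are illustrative choices, not part of the course materials; the update equations follow the standard Adam formulation (momentum and RMSprop-style moment estimates with bias correction).

```python
import math

def adam_minimize(grad, w, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=200):
    """Minimize a 1-D function with Adam, given its gradient `grad`."""
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = beta1 * m + (1 - beta1) * g       # first moment (Momentum-style average)
        v = beta2 * v + (1 - beta2) * g * g   # second moment (RMSprop-style average)
        m_hat = m / (1 - beta1 ** t)          # bias correction for early steps
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3);
# the iterates should approach the minimum at w = 3.
w_star = adam_minimize(lambda w: 2 * (w - 3), w=0.0)
```

In the course you would implement updates like this inside a full training loop and compare their convergence against plain mini-batch gradient descent.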
This is the second course of the Deep Learning Specialization.
Requirements
Access to a computer or mobile device with an internet connection.
Motivation to learn!
No special materials or prerequisite knowledge are required for this course.
Who this course is for
Students who are new to this field
Students willing to put in a couple of hours to learn about Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
Advanced students wanting to add another skill to their portfolio
Andrew Ng – CEO/Founder, Landing AI; Co-founder, Coursera; Adjunct Professor, Stanford University; formerly Chief Scientist, Baidu, and founding lead of Google Brain
Head Teaching Assistant – Kian Katanforoosh – Lecturer of Computer Science at Stanford University, deeplearning.ai, Ecole CentraleSupelec
Teaching Assistant – Younes Bensouda Mourri – Mathematical & Computational Sciences, Stanford University, deeplearning.ai
This course includes
Learn at your own pace
Course videos and reading material
Peer-graded assignments with feedback from other course participants
Graded quizzes with feedback
Graded programming assignments