
[AI] Neural Networks and Deep Learning: Unit Summary

DeepLearning.AI Specialization

(https://www.coursera.org/specializations/deep-learning)

Professor Andrew Ng

 

Course 1: Neural Networks and Deep Learning

 

 

  • Week 1: Introduction to Deep Learning

 

It was just a general introduction to Deep Learning. If you don't have enough time to watch it, I recommend skipping this week.

 

  • Week 2: Neural Networks Basics

 

This week you take a practical first step, building a Neural Network on a single training example.

Starting with Binary Classification, you will learn Logistic Regression (along with its Cost function and the Sigmoid function) and Gradient Descent (for adjusting the parameters). You will also learn the basic concept of backpropagation (which is likewise about adjusting the parameters).

I was actually confused by the notion of m training data: it means a training set of m separate examples, not one training example made up of m pieces of data.

 

    • Binary Classification

    • Logistic Regression

      • Cost function

    • Gradient Descent

      • Derivative

    • (Back Propagation)

    • Handling one training example out of m training examples
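
The Week 2 pieces (sigmoid, cost function, gradient descent over m examples) fit together in just a few lines of NumPy. This is a minimal sketch in the course's notation (w, b, and X with one column per example), not the assignment's own code:

```python
import numpy as np

def sigmoid(z):
    # Squash any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression_step(X, Y, w, b, lr=0.1):
    """One gradient-descent step over m training examples.

    Shapes: X (n_x, m) with one column per example,
            Y (1, m) labels in {0, 1},
            w (n_x, 1) weights, b a scalar bias.
    """
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                    # (1, m) predictions for all m examples
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))  # cross-entropy cost
    dZ = A - Y                                  # (1, m) error signal
    dw = (X @ dZ.T) / m                         # (n_x, 1) gradient w.r.t. w
    db = np.sum(dZ) / m                         # gradient w.r.t. b
    w = w - lr * dw                             # gradient-descent update
    b = b - lr * db
    return w, b, cost
```

Repeating this step drives the cost down; the averaging over m is exactly where "one training example out of m" enters the picture.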

 
 
  • Week 3: Shallow Neural Networks

This week you deal with the Neural Network on a large amount of data (m training examples), feeding it through one hidden layer. You gradually work up from a single unit and its parameters to the full hidden layer. I actually struggled with the dimensions of the matrices.
 
    • Dealing m training data

    • One hidden Layer

      • One unit of one hidden layer

        • Vectorization

        • Setting W(parameter)

        • Activation function

          • Sigmoid function

          • tanh(x)

          • ReLU

          • Leaky ReLU

    • A calculation that comes from one hidden Layer
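
What finally sorted out the matrix dimensions for me is the rule that W for a layer with n_h units receiving n_x inputs has shape (n_h, n_x), and vectorization processes all m columns of X at once. A minimal sketch of the one-hidden-layer forward pass (function name is my own, not the course's):

```python
import numpy as np

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def forward_one_hidden_layer(X, W1, b1, W2, b2):
    """Vectorized forward pass for m examples at once.

    Shapes: X (n_x, m), W1 (n_h, n_x), b1 (n_h, 1),
            W2 (1, n_h), b2 (1, 1).
    Broadcasting adds b1 and b2 to every column, i.e. to every example.
    """
    Z1 = W1 @ X + b1      # (n_h, m)
    A1 = np.tanh(Z1)      # hidden activation: tanh (ReLU is the other common choice)
    Z2 = W2 @ A1 + b2     # (1, m)
    A2 = sigmoid(Z2)      # output activation: sigmoid, for binary labels
    return A1, A2
```

Note that W1 and W2 must be initialized to small random values rather than zeros; otherwise every hidden unit computes the same thing and they never differentiate.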

 
 
  • Week 4: Deep Neural Networks

This week you deal with m training examples across several hidden layers. To adjust the parameters, you also learn forward and backward propagation. I really liked this week's lectures, because they summed up everything we learned over the past month. (Also, the professor teaches well. :) )
    • Dealing with several hidden layers

    • Forward Propagation + Backward Propagation

      • Adjusting Parameters: dW, db, W, b

      • Hyperparameters and Parameters
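
The whole Course 1 loop (forward propagation, backward propagation, then updating W and b with dW and db) can be sketched for a general L-layer network. This is my own compressed version, assuming ReLU hidden layers and a sigmoid output, not the assignment's exact code; lr and layer_dims are the hyperparameters, W and b the parameters:

```python
import numpy as np

def init_params(layer_dims, seed=1):
    """layer_dims like [n_x, n_h, 1]; W for layer l has shape (n[l], n[l-1])."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def train_step(X, Y, params, L, lr=0.5):
    """One forward + backward pass over m examples, then update every W and b."""
    m = X.shape[1]
    A = {0: X}
    Z = {}
    # Forward propagation: ReLU in hidden layers, sigmoid at the output
    for l in range(1, L + 1):
        Z[l] = params[f"W{l}"] @ A[l - 1] + params[f"b{l}"]
        A[l] = 1.0 / (1.0 + np.exp(-Z[l])) if l == L else np.maximum(0.0, Z[l])
    cost = -np.mean(Y * np.log(A[L]) + (1 - Y) * np.log(1 - A[L]))
    # Backward propagation: for sigmoid + cross-entropy, dZ at the output is A - Y
    dZ = A[L] - Y
    for l in range(L, 0, -1):
        dW = (dZ @ A[l - 1].T) / m
        db = np.sum(dZ, axis=1, keepdims=True) / m
        if l > 1:  # propagate the error down one layer (ReLU derivative) before updating W
            dZ = (params[f"W{l}"].T @ dZ) * (Z[l - 1] > 0)
        params[f"W{l}"] -= lr * dW   # the update step: W := W - lr * dW
        params[f"b{l}"] -= lr * db   # and b := b - lr * db
    return params, cost
```

Running train_step in a loop makes the cost fall, which is the whole point of the dW/db bookkeeping from Week 4.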