In the previous class we learned what an MLP is, and we came across a term called gradient descent. Today we study gradient descent.

Gradient Descent

In mathematics, gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent.

[Image: illustration of gradient descent]
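The idea above can be sketched in a few lines of Python. This is a minimal illustration on a hand-picked one-variable function; the function, learning rate, and step count are arbitrary choices, not part of the original text.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient (the direction of steepest descent)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Example: f(x) = (x - 3)^2 has gradient f'(x) = 2*(x - 3) and its minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges to 3.0
```

Each iteration moves the point a little downhill; with a small enough learning rate the sequence settles at the local minimum.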
Multilayer Perceptron

A Multilayer Perceptron is a collection of perceptrons arranged in layers, forming an artificial neural network.

[Image: ANN (Multilayer Perceptron)]

Backpropagation

Backpropagation is the process used to find the weights and biases:

Step 1: Initialize all weights and biases
Step 2: Choose a loss function
Step 3: Apply the gradient descent technique to find the correct weights and biases
How do we train a neural network as a regression model? In the previous class we discussed how to train a multiclass classifier. To convert it to a regression model, we only need to change the last layer to a single output neuron (with no activation function, so it can produce any real value).
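The change of output layer can be shown side by side. This sketch assumes some already-computed hidden activations (random toy values here) and compares a 3-class softmax head with a single linear regression neuron; the sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
hidden = rng.normal(size=(5, 8))  # toy hidden activations for 5 samples

# Classification head: one neuron per class plus softmax (3 classes assumed)
W_cls = rng.normal(size=(8, 3)); b_cls = np.zeros(3)
logits = hidden @ W_cls + b_cls
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Regression head: a single linear neuron, no activation
W_reg = rng.normal(size=(8, 1)); b_reg = np.zeros(1)
y_hat = hidden @ W_reg + b_reg  # one real-valued prediction per sample

print(probs.shape, y_hat.shape)  # (5, 3) vs (5, 1)
```

Everything before the last layer stays the same; only the head (and the loss, e.g. mean squared error instead of cross-entropy) changes.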