Gradient Descent

In the previous class we learned what an MLP is, and we came across a term called gradient descent. Today we will study gradient descent.

Gradient descent

In mathematics, gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent.

The idea is illustrated in the image below.
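Beyond the picture, a small code sketch can make the update rule concrete. This is a minimal, illustrative example (not from the original post): the function f, its gradient, the learning rate, and the number of steps are all assumed choices. Each iteration moves the current point a small step against the gradient, x_new = x_old - learning_rate * f'(x_old), until it settles near the minimum.

```python
# Minimal gradient descent sketch (illustrative values, not from the post).
# Minimizes f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def f(x):
    return (x - 3) ** 2

def grad_f(x):
    return 2 * (x - 3)

def gradient_descent(x0, learning_rate=0.1, steps=50):
    """Repeatedly step opposite to the gradient: x <- x - learning_rate * f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad_f(x)
    return x

if __name__ == "__main__":
    x_min = gradient_descent(x0=0.0)
    print(f"Found minimum near x = {x_min:.4f}, f(x) = {f(x_min):.6f}")
    # x ends up very close to 3, where f(x) is very close to 0.
```

If the learning rate is too large the steps can overshoot the minimum and diverge; if it is too small, convergence is very slow, which is why this value usually has to be tuned.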


