
Posts

Showing posts from May, 2023

Gradient Descent

In the previous class we learned what an MLP is, and we also heard a term called gradient descent; today we study gradient descent. In mathematics, gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient of the function at the current point, because this is the direction of steepest descent.

[Figure: illustration of gradient descent]
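The steps described above can be sketched in a few lines of Python. This is a minimal illustration, not the post's own code: the function f(x) = (x - 3)^2, its gradient 2(x - 3), the starting point, and the learning rate are all assumed for the example.

```python
# Minimal gradient descent sketch on f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3); the minimum is at x = 3.
# Function, starting point, and learning rate are illustrative choices.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step in the direction opposite to the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

grad = lambda x: 2 * (x - 3)  # derivative of (x - 3)^2
x_min = gradient_descent(grad, x0=0.0)
print(x_min)  # converges toward 3.0
```

Each iteration moves x a little way downhill; with a small enough learning rate the sequence settles at the local minimum.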

MLP (Multilayer Perceptron)

A Multilayer Perceptron is a collection of perceptrons arranged in layers, forming an artificial neural network (ANN).

[Figure: ANN (multilayer perceptron)]

Backpropagation

Backpropagation is the process used to find the weights and biases:

Step 1: Initialize all weights and biases.
Step 2: Choose a loss function.
Step 3: Apply the gradient descent technique to find the correct weights and biases.
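The three steps above can be sketched as a tiny MLP trained on XOR. This is an illustrative example only: the 2-2-1 architecture, the random seed, the mean-squared-error loss, the learning rate of 0.5, and the iteration count are all assumed, not taken from the post.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Step 1: initialize all weights and biases (illustrative 2-2-1 network)
W1 = rng.normal(size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass through both layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Step 2: the loss function (mean squared error)
    losses.append(float(np.mean((out - y) ** 2)))

    # Step 3: gradient descent -- backpropagate the loss gradient
    d_out = (out - y) * out * (1 - out)   # gradient at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at the hidden layer
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Running the loop drives the loss down as the weights and biases are adjusted, which is exactly what Steps 1-3 describe.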

Perceptron

A perceptron is the basic building block used in deep learning; it is analogous to a neuron in the brain.

Perceptron Layers

1. Input: we provide the input to our model.
2. Calculation: we compute the weighted sum using the weights and biases of our perceptron.
3. Output: (Σ wᵢxᵢ) + b.

[Figure: perceptron architecture]
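The three stages above can be sketched directly in Python. The step activation and the hand-picked weights implementing logical AND are assumptions for the example, not part of the original post.

```python
# Minimal perceptron sketch: output = (sum of w_i * x_i) + b,
# passed through a step activation. Weights below are illustrative.

def perceptron(x, w, b):
    """Weighted sum plus bias, then a step activation."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + b  # (sigma of w_i x_i) + b
    return 1 if s > 0 else 0

# Example: hand-picked weights make this perceptron compute logical AND.
print(perceptron([1, 1], w=[1, 1], b=-1.5))  # 1
print(perceptron([0, 1], w=[1, 1], b=-1.5))  # 0
```

A single perceptron can only separate linearly separable classes, which is why the multilayer perceptron from the later post stacks many of them.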