So what are partial derivatives? Don't worry, I am not going to take a boring maths class; we will go straight to the point.
1. Partial derivatives are used in machine learning backpropagation.
So all of a sudden, what is this backpropagation? Before that, there must be a forward prop, or forward pass.
Ok, got it? Don't worry, we will walk through an example.
1. Consider data points x = [1, 2, 3], some values for our understanding.
2. Consider a function y = x, so the values of y will be [1, 2, 3].
3. Consider a function y = x^2, so the values of y will be [1, 4, 9].
4. In real life the data points could be different, and the dependent function y could be something like y = m*x + b or y = w1*x1 + w2*x2 + b (see the sketch after this list).
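Here is a minimal sketch of that forward pass in Python. The values m = 2 and b = 1 are just numbers I picked for illustration, not from any real fit.

x = [1, 2, 3]

y_identity = [xi for xi in x]          # y = x        -> [1, 2, 3]
y_square   = [xi ** 2 for xi in x]     # y = x^2      -> [1, 4, 9]

m, b = 2, 1                            # assumed parameter values, just for the demo
y_line = [m * xi + b for xi in x]      # y = m*x + b  -> [3, 5, 7]

print(y_identity, y_square, y_line)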
As you can see when the data is plotted, the line we are trying to fit could be either the green or the blue line. So what are we trying to achieve with this?
1. Plot your data as X.
2. Have a function Y that is dependent on X.
3. Use the function to predict values of y for any x.
Since we have introduced variables like m, b, w1, w2, which are part of the function, let's take y = m*x + b.
Let's consider y_hat as the predicted value for any x. If y_hat is the same as y, the function is perfect, so there is no loss: loss = y - y_hat = 0.
If loss > 0:
1. We cannot change the values of x; the only way is to introduce a parameter 'm' that can induce change. This is achieved by finding a value of 'm' such that the loss tends to zero.
2. Start with a random value of m and find the loss.
3. Update the value of 'm' so the loss tends to zero. How do we update the value? Here comes the use of the partial derivative: it tells us how things change for the smallest change in 'm' (see the sketch after this list).
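Here is a minimal sketch of those three steps, assuming the true data was generated with m = 3 and using the squared error (y - y_hat)^2 as the loss. That squared loss is a common choice I am adding here, not something spelled out above, but the idea is the same: drive the loss toward zero.

x = [1.0, 2.0, 3.0]
y = [3.0 * xi for xi in x]      # assumed "true" data, generated with m = 3

m = 0.5                         # step 2: start with a random-ish value of m
lr = 0.01                       # learning rate: how big each update to m is

for step in range(200):
    # forward pass: predictions with the current m (b is left out for simplicity)
    y_hat = [m * xi for xi in x]

    # loss = sum((y_hat - y)^2)
    # partial derivative of the loss with respect to m:
    # dloss/dm = sum(2 * (y_hat - y) * x)
    grad_m = sum(2 * (yh - yi) * xi for yh, yi, xi in zip(y_hat, y, x))

    # step 3: move m a little in the direction that makes the loss smaller
    m = m - lr * grad_m

print(m)   # ends up very close to 3.0

The update rule m = m - lr * grad_m is plain gradient descent: the partial derivative tells us which way (and how strongly) the loss changes when m changes, so we step m the other way.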
I think that's enough of the story.
Consider a function y = x^2.
Partial derivative: dy/dx = 2*x (power rule: d/dx [x^n] = n*x^(n-1)).
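You can check that numerically by nudging x a tiny bit and seeing how much y changes (a quick sanity check, not how real frameworks compute gradients):

def f(x):
    return x ** 2

x = 3.0
h = 1e-6
numeric = (f(x + h) - f(x)) / h   # approximate slope at x = 3
print(numeric, 2 * x)             # both are about 6.0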
Consider y = m*x + b.
Partial derivatives:
dy/dm = x
dy/db = 1
Consider y = w1*x1 + w2*x2 + b.
Partial derivatives:
dy/dw1 = x1
dy/dw2 = x2
dy/db = 1
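The same nudge-one-thing-at-a-time check works for the parameters too. The inputs x1 = 2, x2 = 5 and the starting parameter values below are just assumptions for the demo; dropping the w2*x2 term gives you the y = m*x + b case above.

def f(w1, w2, b, x1, x2):
    return w1 * x1 + w2 * x2 + b

x1, x2 = 2.0, 5.0
w1, w2, b = 0.3, -0.7, 1.5
h = 1e-6

base = f(w1, w2, b, x1, x2)
dy_dw1 = (f(w1 + h, w2, b, x1, x2) - base) / h   # ~= x1 = 2
dy_dw2 = (f(w1, w2 + h, b, x1, x2) - base) / h   # ~= x2 = 5
dy_db  = (f(w1, w2, b + h, x1, x2) - base) / h   # ~= 1

print(dy_dw1, dy_dw2, dy_db)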