Alright, the Newton–Raphson method is used here to find an approximate value
of some variable.
Looks weird, right? The idea is to try out a technique for finding the
parameter values of a function so that the loss is minimal, i.e. close to 0.
The method gives the update equation
x_n = x_(n-1) - f(x_(n-1)) / f'(x_(n-1))
where f(x) is some function
and f'(x) is the derivative of f(x).
Here I have tried some function of my own, or you could see the wiki reference below for an example.
From the work below you can see the value after the 1st iteration, but to minimise the loss we need to go for further iterations.
Let's see how the 2nd iteration performs.
We have arrived at the value of x at x1, so the loss, which is y, is zero.
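The exact function I worked through isn't reproduced here, so here is a rough sketch in Python of the same idea, with an assumed f(x) = x**3 - x - 2 and an assumed starting guess of 2.0; the "loss" is just y = f(x), which should head towards 0 as the iterations go on.

def f(x):
    # some function of our own (assumed just for this sketch)
    return x ** 3 - x - 2

def f_prime(x):
    # derivative of f(x)
    return 3 * x ** 2 - 1

x = 2.0                            # initial guess x0 (assumed)
for n in range(1, 6):
    x = x - f(x) / f_prime(x)      # Newton-Raphson update
    print(f"iteration {n}: x = {x:.6f}, loss y = f(x) = {f(x):.6f}")

After a handful of iterations x settles near 1.5214 and f(x) drops to roughly 0.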
So that's what we have achieved with this. You can try out some other equation
to get an approximate value. Still thinking of one?
Consider another equation, x^2 = a. Finding a root of f(x) = x^2 - a means finding
x = sqrt(a).
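As a quick sketch of that (assuming a = 2 and a starting guess of 1.0, both picked just for illustration), we take f(x) = x^2 - a, f'(x) = 2x, and apply the same update:

a = 2.0                            # the number whose square root we want (assumed)
x = 1.0                            # starting guess (assumed)
for n in range(1, 6):
    x = x - (x ** 2 - a) / (2 * x) # Newton-Raphson update for f(x) = x^2 - a
    print(f"iteration {n}: x = {x:.10f}")

This converges to sqrt(2) ≈ 1.4142135624 within a few iterations.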
So the idea here is to optimize some parameter; right now it is 'x',
but it could be another parameter like 'm' or 'w1' which we use in machine learning:
y = mx + b
y = w1X + b
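To connect that back to y = mx + b, here is a small illustration of my own (not from the post): I assume a tiny made-up dataset whose true slope is 2, hold b fixed at 0, and apply the same update to the derivative of a mean squared error loss, so the "function" whose root we hunt for is dL/dm.

xs = [1.0, 2.0, 3.0]               # assumed toy inputs
ys = [2.0, 4.0, 6.0]               # assumed toy targets (true slope m = 2, b = 0)

def loss(m):
    # mean squared error of y = m*x (b held at 0 for simplicity)
    return sum((m * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def dloss(m):
    # first derivative of the loss w.r.t. m
    return sum(2 * (m * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def d2loss(m):
    # second derivative of the loss w.r.t. m (a constant here)
    return sum(2 * x * x for x in xs) / len(xs)

m = 0.0                            # starting guess for the slope
for n in range(1, 4):
    m = m - dloss(m) / d2loss(m)   # Newton update applied to dL/dm
    print(f"iteration {n}: m = {m:.4f}, loss = {loss(m):.6f}")

Because this loss is quadratic in m, the update lands on m = 2 in a single step; with messier losses it takes more iterations.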
But Newton's method has its own failures, which we will see with an example.
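One such failure mode (my own illustration; I'm not sure which example the post had in mind): for f(x) = x^(1/3) the update works out to x_n = -2 * x_(n-1), so instead of homing in on the root at 0 the iterates keep jumping further away.

import math

def f(x):
    # signed cube root, f(x) = x^(1/3)
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def f_prime(x):
    # derivative of x^(1/3): (1/3) * x^(-2/3), positive for x != 0
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

x = 1.0                            # starting guess (assumed)
for n in range(1, 7):
    x = x - f(x) / f_prime(x)      # here the update doubles x and flips its sign
    print(f"iteration {n}: x = {x}") # 1 -> -2 -> 4 -> -8 ... diverges

Another classic trouble spot is hitting a point where f'(x) is 0, which makes the division blow up.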
So this should give an idea of an algorithm for optimization.
Cool then, bye for now. Thanks!
Reference:
https://en.wikipedia.org/wiki/Newton's_method