This page only deals with the algorithm theory; for a worked example of this theory, check out newtons-raphson-method-for-arriving-at.html.

Alright, the theory says: for a function y = x - a, we want to optimize the parameter x so that y = 0 (or close to zero). This is a technique used for optimization of parameters in machine learning problems. Consider a = 5; if x = 5, then y will be zero. But this case is not straightforward in general, and the value of x is obtained using the Newton-Raphson method, which iterates:

x_{n+1} = x_n - f(x_n) / f'(x_n)

where

f(x) = y = x - a
f'(x) = dy/dx = 1

Consider 2 iterations here, n = 0 and 1. The method starts by assuming an initial value for x at n = 0, giving x0 and y0. Plotting the values on a graph, you can see I have plotted the point (x0, y0); the tangent line at that point cuts the x-axis at x1, where y is 0, which says the loss is zero at iteration x1. You may now wonder how the value of x has moved from x0 ...
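Below is a minimal Python sketch of the iteration described above, using f(x) = x - a with a = 5 and f'(x) = 1. The function name newton_raphson, the initial guess x0 = 0.0, and the iteration count are assumptions chosen for illustration, not something fixed by the post.

```python
def f(x, a=5.0):
    """y = x - a, the function whose root we want (y = 0)."""
    return x - a

def f_prime(x):
    """dy/dx = 1 for f(x) = x - a."""
    return 1.0

def newton_raphson(x0, iterations=2):
    """Apply x_{n+1} = x_n - f(x_n) / f'(x_n) starting from x0."""
    x = x0
    for n in range(iterations):
        x = x - f(x) / f_prime(x)
        print(f"iteration {n + 1}: x = {x}, y = {f(x)}")
    return x

# Assumed initial guess x0 = 0.0: the first iteration already lands on
# x = 5, where y = 0 (the loss is zero), because for a straight line
# the tangent is the line itself.
newton_raphson(x0=0.0, iterations=2)
```

For this linear example the tangent at (x0, y0) is the line itself, so x1 is exactly the root; for curved functions the iteration would instead move x step by step toward the root.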