In Python (3.8) I am trying to write a script that takes a function f(x)
as input, e.g.
f(x) = 1/x
If we define y = f(x)
as a curve in the Euclidean plane, we can calculate the distance d
from the origin (0,0)
to each point (x, f(x))
on the curve as
d(x) = sqrt(x^2 + (f(x))^2)
My goal is to find the x
such that this distance is minimised. Since d(x) is minimised wherever d(x)^2 is minimised, this can be done by setting the derivative of d(x)^2 to zero, i.e. by solving
2x + 2*f(x)*f'(x) = 0
I would be grateful for any help. Thanks.
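In case it helps, here is one possible sketch using only the standard library: it finds a root of g(x) = 2x + 2*f(x)*f'(x) by bisection, given the derivative f' and a bracket [lo, hi] on which g changes sign. The function names (`minimise_distance`), the bracket values, and the choice of bisection are my own assumptions, not part of the question; a library routine such as `scipy.optimize.brentq` or `scipy.optimize.minimize_scalar` would also work.

```python
import math

def minimise_distance(f, df, lo, hi, tol=1e-10):
    """Return x in [lo, hi] where g(x) = 2x + 2*f(x)*f'(x) = 0,
    found by bisection. Requires that g changes sign on [lo, hi]."""
    def g(x):
        return 2 * x + 2 * f(x) * df(x)
    a, b = lo, hi
    if g(a) * g(b) > 0:
        raise ValueError("g(x) must change sign on [lo, hi]")
    while b - a > tol:
        m = (a + b) / 2
        # Keep the half-interval where the sign change occurs.
        if g(a) * g(m) <= 0:
            b = m
        else:
            a = m
    return (a + b) / 2

# Example from the question: f(x) = 1/x, so f'(x) = -1/x**2.
# On x > 0 the minimum is at x = 1, where d = sqrt(2).
f = lambda x: 1 / x
df = lambda x: -1 / x ** 2
x_min = minimise_distance(f, df, 0.5, 2.0)
d_min = math.sqrt(x_min ** 2 + f(x_min) ** 2)
```

For f(x) = 1/x the bracket [0.5, 2.0] works because g(0.5) is negative and g(2.0) is positive; for other functions you would need to pick a bracket that straddles the stationary point, or supply a numerical derivative if f' is not known in closed form.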