For `min_steps`:

- base_step = epsilon**(1./2.5)
- step_ratio = 2
- num_steps = 50
The resulting plot is shown below.
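A minimal sketch of how such a step sequence could be generated from the parameters above, assuming the steps grow geometrically from the minimum `base_step` by factors of `step_ratio` (the function name and generation rule are illustrative, not scipy.diff's actual code):

```python
import numpy as np

def min_step_sequence(num_steps=50, step_ratio=2.0):
    """Generate a geometric sequence of step sizes starting from
    base_step = epsilon**(1/2.5), per the `min_steps` setting above.
    (Assumed generation rule, for illustration only.)"""
    epsilon = np.finfo(float).eps          # machine epsilon for float64
    base_step = epsilon ** (1.0 / 2.5)     # smallest step in the sequence
    # Each subsequent step is step_ratio times the previous one.
    return base_step * step_ratio ** np.arange(num_steps)

steps = min_step_sequence()
```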
## Comparison with autograd
Autograd is a Python library that computes derivatives using automatic differentiation. To cross-test and analyze scipy.diff, functions from scipy.special and scipy.optimize were used. The accuracy results are as follows:
scipy.diff is less accurate than autograd because of the difference in technique: autograd propagates exact derivative rules through the computation, while scipy.diff approximates derivatives numerically using finite-difference steps. However, scipy.diff is more versatile and can work on more complex functions.
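The accuracy gap can be illustrated without autograd by differentiating a function with a known derivative. A central difference has truncation error of order h**2 plus a rounding error of order eps/h, so its accuracy bottoms out well above machine precision, whereas automatic differentiation is exact up to rounding (this standalone `central_diff` helper is a stand-in for scipy.diff's scheme, not its actual implementation):

```python
import numpy as np

def central_diff(f, x, h):
    # Second-order central difference: error ~ O(h**2) + O(eps / h).
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = 1.0
exact = np.cos(x)  # d/dx sin(x) = cos(x)
for h in (1e-2, 1e-6, 1e-12):
    approx = central_diff(np.sin, x, h)
    print(f"h = {h:.0e}, error = {abs(approx - exact):.2e}")
```

Shrinking h first reduces the error, then makes it worse again once rounding dominates, which is why step-size selection (the `min_steps` parameters above) matters for a finite-difference method.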