Autograd is a Python library that computes derivatives using automatic differentiation. To cross-test and analyze scipy.diff, functions from scipy.special and scipy.optimize were used. The accuracy results are as follows:
scipy.diff is less accurate than autograd because of the difference in the techniques the two use to compute derivatives: autograd propagates exact derivatives through the computation (automatic differentiation), while scipy.diff approximates derivatives numerically with finite differences. However, scipy.diff is more versatile and can work on more complex functions.
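For reference, a cross-test of this kind can be sketched roughly as below, assuming autograd and numdifftools are installed. The test function here is purely illustrative (the original experiments used functions from scipy.special and scipy.optimize), and numdifftools stands in for the numerical differentiation that scipy.diff builds on:

```python
# Sketch of a cross-test: automatic differentiation (autograd) vs.
# adaptive finite differences (numdifftools). The function f is made up
# for illustration.
import autograd.numpy as np   # autograd's thin wrapper around numpy
import numdifftools as nd
from autograd import grad

def f(x):
    return np.sin(x) * np.exp(-x ** 2)

x0 = 0.5

exact = grad(f)(x0)              # automatic differentiation
numeric = nd.Derivative(f)(x0)   # finite-difference approximation

print(exact, numeric, abs(exact - numeric))
```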
This blog is dedicated to the third week of Google Summer of Code (i.e., June 16 - June 23). But first, a brief look at how the code works.
This blog is dedicated to the second week of Google Summer of Code (i.e., June 8 - June 15). The target of the second week, according to my timeline, was to implement the Jacobian and gradient using numdifftools.
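As a rough sketch of what that looks like with numdifftools (the test functions here are hypothetical, chosen only to show the shapes involved):

```python
# Gradient of a scalar-valued function and Jacobian of a vector-valued
# function via numdifftools; both test functions are illustrative.
import numpy as np
import numdifftools as nd

def rosen(x):        # scalar-valued, so a gradient applies
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def residuals(x):    # vector-valued, so a Jacobian applies
    return np.array([x[0] ** 2 + x[1], np.sin(x[0]) * x[1]])

x0 = np.array([1.0, 2.0])

print(nd.Gradient(rosen)(x0))      # shape (2,)
print(nd.Jacobian(residuals)(x0))  # shape (2, 2)
```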
This blog is dedicated to the first week of Google Summer of Code (i.e., June 1 - June 7). The target of the first week, according to my timeline, was to get conversant with the code structure and to implement the derivative using statsmodels and partly numdifftools.
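For illustration, a one-dimensional derivative can be computed with both backends roughly as follows; approx_fprime comes from statsmodels.tools.numdiff, and the test function is made up for this sketch:

```python
# Derivative of a test function via numdifftools and statsmodels.
import numpy as np
import numdifftools as nd
from statsmodels.tools.numdiff import approx_fprime

def f(x):
    return np.exp(-x) * np.sin(x)

x0 = np.array([1.0])

d_nd = nd.Derivative(f)(1.0)                # adaptive finite differences
d_sm = approx_fprime(x0, f, centered=True)  # central finite differences

print(d_nd, d_sm)
```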
This is the basic outline of the design of the scipy.diff module. I have thoroughly investigated how statsmodels and numdifftools work, and have also looked at scipy.misc.derivative. For scipy.diff I propose adaptations from numdifftools and statsmodels, mainly because of their accurate results, sophisticated computation techniques, and ease of accessibility.