Jekyll | 2023-02-02T20:54:48-08:00 | https://ashwinpathak20.github.io/feed.xml | Ashwin's blog | Software Development Engineer at Amazon | Ashwin Pathak | ashwinpathak20nov1996@gmail.com | https://github.com/ashwinpathak20/ashwinpathak20.github.io/blob/master/files/Ashwin_Pathak_Resume___Aug.pdf
GSOC 2017 - Week 4 of GSoC 17 | 2017-06-30 | https://ashwinpathak20.github.io/posts/2017/06/blog-post-6
<p>This blog is dedicated to the fourth week of Google Summer of Code (i.e. June 24 - July 1). This week concentrated on cross-testing and analysis of the API with some challenging tests.</p>
<h2 id="plot-of-error-vs-steps-">Plot of Error v/s Steps :</h2>
<p>For ‘max_steps’:</p>
<p>base_step = 2.0</p>
<p>step_ratio = 2</p>
<p>num_steps = 50</p>
<p>The plot is shown below. Initially the steps are large, so the error is substantial due to formula (truncation) error. As the steps shrink, the error decreases, but past a certain point it starts increasing again because of subtraction (round-off) error.
<img src="https://ashwinpathak20.github.io/images/max_step.png" alt="max_steps" /></p>
<p>For ‘min_steps’:
base_step = (epsilon)^(1./2.5)</p>
<p>step_ratio = 2</p>
<p>num_steps = 50</p>
<p>The plot is as shown below.
<img src="https://ashwinpathak20.github.io/images/min_step.png" alt="min_steps" /></p>
<h2 id="comparison-with-autograd-">Comparison with autograd :</h2>
<p>Autograd is a Python library that computes derivatives using automatic differentiation. In order to cross-test and analyze scipy.diff, functions from scipy.special and scipy.optimize were used. The accuracy results are as follows:
<img src="https://ashwinpathak20.github.io/images/compare.png" alt="figure" /></p>
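<p>As a minimal stand-in for this kind of cross-test (without requiring autograd), one can compare a central difference against a known analytic derivative; here <code>math.erf</code> is used, whose derivative is 2/&radic;&pi;&middot;e<sup>&minus;x&sup2;</sup>. The helper name <code>central</code> is illustrative, not part of scipy.diff:</p>

```python
import math

def central(f, x, h=1e-5):
    """Second-order central difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 0.7
numeric = central(math.erf, x)
analytic = 2.0 / math.sqrt(math.pi) * math.exp(-x * x)  # d/dx erf(x)
error = abs(numeric - analytic)
```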
<p>scipy.diff is less accurate than autograd because of the difference in the techniques the two use for computing derivatives. However, scipy.diff is more versatile and can work on more complex functions.</p>
GSOC 2017 - Week 3 of GSoC 17 | 2017-06-23 | https://ashwinpathak20.github.io/posts/2017/06/blog-post-5
<p>This blog is dedicated to the third week of Google Summer of Code (i.e. June 16 - June 23).
But first, a brief insight is given about how the code works.</p>
<h2 id="step-generators">Step Generators</h2>
<p>For computing the derivatives, steps are required with which the approximation is made.
Each element generated by a step generator represents h in
<img src="https://wikimedia.org/api/rest_v1/media/math/render/svg/240200932143283e051efead968a0bec0134e3a0" alt="numerical differentiation" />.</p>
<p>Choosing the steps is important for the approximation. If the steps are very small,
the derivative becomes inaccurate due to rounding error caused by subtraction, while if the steps are too large,
the results will be inaccurate due to formula (truncation) error.</p>
<p>The figure below shows how the error deviates from zero as the steps are increased or decreased beyond a point.
<img src="https://ashwinpathak20.github.io/images/Figure_1.png" alt="Figure" /></p>
<p>‘max_steps’ generates a decreasing sequence, while ‘min_steps’ generates an increasing sequence.</p>
<p>For ‘max_steps’, the steps are calculated as:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>for i in range(num_steps):
    steps = base_step * step_ratio**(-i + offset)
    if (np.abs(steps) > 0).all():
        yield steps
</code></pre></div></div>
<p>And for ‘min_steps’:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>for i in range(num_steps):
    steps = base_step * step_ratio**(i + offset)
    if (np.abs(steps) > 0).all():
        yield steps
</code></pre></div></div>
<p>where the effective base_step is the supplied base_step multiplied by step_nom.</p>
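<p>Put together, the two branches above can be sketched as one runnable generator (the name <code>generate_step</code> and the plain-float handling are illustrative simplifications of the actual array-based code):</p>

```python
def generate_step(base_step=2.0, step_ratio=2.0, num_steps=4,
                  offset=0, step='max_step'):
    """Yield base_step * step_ratio**(sign*i + offset) for i in
    range(num_steps); 'max_step' decreases, 'min_step' increases."""
    sign = -1 if step == 'max_step' else 1
    for i in range(num_steps):
        s = base_step * step_ratio**(sign * i + offset)
        if abs(s) > 0:  # skip steps that underflow to zero
            yield s

decreasing = list(generate_step())  # [2.0, 1.0, 0.5, 0.25]
increasing = list(generate_step(base_step=0.25, step='min_step'))  # [0.25, 0.5, 1.0, 2.0]
```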
<h3 id="parameters">Parameters</h3>
<ul>
<li>
<p>x : array, optional, default is np.asarray(1)</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>The points at which steps are to be generated.
</code></pre></div> </div>
</li>
<li>
<p>n : int, optional, default is 1</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>order of derivative.
</code></pre></div> </div>
</li>
<li>
<p>order : int, optional, default is 2</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>defines the order of the error term in the Taylor approximation used.
</code></pre></div> </div>
</li>
<li>
<p>method : {‘central’, ‘forward’, ‘backward’}, optional, default: ‘central’</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>defines the method to be used.
</code></pre></div> </div>
</li>
<li>
<p>base_step : float, array-like, optional</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Defines the start step.
Default: 2 for max_step,
EPS**(1./scale) for min_step.
</code></pre></div> </div>
</li>
<li>
<p>scale : real scalar, optional, default is 2.5</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>scale used in base step.
</code></pre></div> </div>
</li>
<li>
<p>num_steps : scalar integer, optional, default 15</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>defines number of steps generated.
</code></pre></div> </div>
</li>
<li>
<p>step_nom : array-like, default maximum(log(1+abs(x)), 1)</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Nominal step is the same shape as x.
</code></pre></div> </div>
</li>
<li>
<p>step_ratio : real scalar, optional, default 2</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Ratio between sequential steps generated.
If None then step_ratio is 2 for n=1
otherwise step_ratio is 1.6.
</code></pre></div> </div>
</li>
<li>
<p>offset : real scalar, optional, default 0</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>offset to the base step.
</code></pre></div> </div>
</li>
<li>
<p>step : {‘max_step’, ‘min_step’}, optional, default ‘max_step’</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Defines the nature of the steps to be generated, increasing or decreasing.
</code></pre></div> </div>
</li>
</ul>
<h3 id="returns">Returns</h3>
<p>steps : generator</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> generator of sequence.
</code></pre></div></div>
<h3 id="examples">Examples</h3>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>>>> step_gen = _generate_step(base_step=0.25, step_ratio=2,
num_steps=4, step='min_step')
>>> [s for s in step_gen()]
[0.25, 0.5, 1.0, 2.0]
</code></pre></div></div>
<h2 id="extrapolation">Extrapolation</h2>
<p>The steps generated with a fixed step_ratio are then used to compute the derivative with different values of h in the finite difference formula. Extrapolating over these estimates yields a more accurate answer.</p>
<p>These calculated derivatives then need to be analysed to estimate the one with the least error. Hence, the derivatives are convolved with appropriate coefficients to refine the result; this is part of Richardson extrapolation. After convolution, errors are estimated by subtracting adjacent values in the array of derivatives.</p>
<p>The median of the errors is then used to statistically select the derivative with the least error; percentiles are used to isolate the majority of derivative estimates lying in a certain range.</p>
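<p>The core idea of Richardson extrapolation can be sketched in a few lines: combine central-difference estimates at two step sizes so that the leading O(h&sup2;) error term cancels. This is a simplified two-term version of what the linked code does; the function names are illustrative:</p>

```python
import math

def central(f, x, h):
    """Second-order central difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def richardson(f, x, h, step_ratio=2.0):
    """Combine estimates at h and h/step_ratio so the O(h**2)
    error term of the central difference cancels."""
    d1 = central(f, x, h)
    d2 = central(f, x, h / step_ratio)
    r = step_ratio ** 2
    return (r * d2 - d1) / (r - 1)

approx = richardson(math.sin, 0.5, 1e-2)  # true value: cos(0.5)
plain = central(math.sin, 0.5, 1e-2)
```

The extrapolated estimate is several orders of magnitude closer to cos(0.5) than the plain central difference at the same base step.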
<p>This whole method can be found <a href="https://github.com/ashwinpathak20/scipy/blob/diff/scipy/diff/_derivative_numdiff.py">here</a>.</p>
<h2 id="hessian">Hessian</h2>
<p>Hessian is a square matrix of second-order partial derivatives of a scalar-valued function.</p>
<p><img src="https://wikimedia.org/api/rest_v1/media/math/render/svg/ceb2ef7133d4ffb011021db5f90126d42058378d" alt="Hessian" /></p>
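<p>The definition above translates directly into a central second-difference sketch (plain Python, for illustration only; the actual scipy.diff code is vectorized and uses adaptive steps, and the name <code>hessian_fd</code> is hypothetical):</p>

```python
def hessian_fd(f, x, h=1e-4):
    """Approximate the Hessian of scalar f at point x with
    central second differences: H[i][j] ~ d^2 f / dx_i dx_j."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            def shifted(si, sj):
                # evaluate f at x with coordinate i shifted by si*h
                # and coordinate j shifted by sj*h
                p = list(x)
                p[i] += si * h
                p[j] += sj * h
                return f(p)
            H[i][j] = (shifted(1, 1) - shifted(1, -1)
                       - shifted(-1, 1) + shifted(-1, -1)) / (4 * h * h)
    return H

# f(x) = x0 + x1^2 + x2^3 at (1, 1, 1): Hessian is diag(0, 2, 6)
H = hessian_fd(lambda p: p[0] + p[1]**2 + p[2]**3, [1.0, 1.0, 1.0])
```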
<h2 id="status-of-week-3">Status of Week 3:</h2>
<p>The Hessian code can be found here :</p>
<p>numdifftools (hessdiag also present) <a href="https://github.com/ashwinpathak20/scipy/blob/diff/scipy/diff/_hessian_numdiff.py">here</a>
statsmodels <a href="https://github.com/ashwinpathak20/scipy/blob/diff/scipy/diff/_derivative.py">here</a></p>
<p>The input to Hessian is a univariate/multivariate function returning a scalar value, a 2D array of points at which the Hessian is to be calculated, and other options for step generation. It returns the Hessian matrices as a 3D array, with each row representing the Hessian matrix corresponding to each input vector.</p>
<p>Example:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> >>> hessian(lambda x : x[0] + x[1]**2 + x[2]**3, [[1,1,1],[2,2,2]])
[[[ 0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 0.00000000e+00 2.00000000e+00 0.00000000e+00]
[ 0.00000000e+00 0.00000000e+00 6.00000000e+00]]
[[ 0.00000000e+00 0.00000000e+00 0.00000000e+00]
[ 0.00000000e+00 2.00000000e+00 2.64678414e-16]
[ 0.00000000e+00 2.64678414e-16 1.20000000e+01]]]
</code></pre></div></div>
GSOC 2017 - Week 2 of GSoC 17 | 2017-06-15 | https://ashwinpathak20.github.io/posts/2017/06/blog-post-4
<p>This blog is dedicated to the second week of Google Summer of Code (i.e. June 8 - June 15). The target of the second week, according to my timeline, was to implement the Jacobian and gradient using numdifftools.</p>
<h2 id="derivative">Derivative</h2>
<p>The derivative of a function of a single variable at a chosen input value, when it exists, is the slope of the tangent line to the graph of the function at that point.</p>
<p><img src="https://upload.wikimedia.org/wikipedia/commons/9/97/Introductory_Physics_fig_1.15.png" alt="Derivative in 2D" /></p>
<p>It is derived numerically as :</p>
<p><img src="https://wikimedia.org/api/rest_v1/media/math/render/svg/52f8a3ef721b7705ef72e10d29176a215b088584" alt="Derivative Formula" /></p>
<h2 id="gradient">Gradient</h2>
<p>Gradient is a multi-variable generalization of the derivative. While a derivative can be defined on functions of a single variable, for functions of several variables, the gradient takes its place. The gradient is a vector-valued function, as opposed to a derivative, which is scalar-valued. If f(x1, …, xn) is a differentiable, real-valued function of several variables, its gradient is the vector whose components are the n partial derivatives of f.</p>
<p>In 3D gradient can be given as:</p>
<p><img src="https://wikimedia.org/api/rest_v1/media/math/render/svg/4487f7230a0ac1b304bb022e6b1e211499c9f78e" alt="Gradient" /></p>
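<p>A minimal central-difference gradient, looping over one coordinate at a time, might look like this (an illustrative sketch, not the scipy.diff implementation; the name <code>gradient_fd</code> is hypothetical):</p>

```python
def gradient_fd(f, x, h=1e-6):
    """Approximate the gradient of scalar f at point x, one
    coordinate at a time, with central differences."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

# f(x, y) = x^2 + 3y at (2, 5): gradient is (2x, 3) = (4, 3)
g = gradient_fd(lambda p: p[0]**2 + 3 * p[1], [2.0, 5.0])
```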
<h2 id="jacobian">Jacobian</h2>
<p>The Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function. Given as:</p>
<p><img src="https://wikimedia.org/api/rest_v1/media/math/render/svg/74e93aa903c2695e45770030453eb77224104ee4" alt="Jacobian" /></p>
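<p>The same one-coordinate-at-a-time idea extends to vector-valued functions, filling the Jacobian one column per input coordinate (an illustrative sketch; the name <code>jacobian_fd</code> is hypothetical):</p>

```python
def jacobian_fd(f, x, h=1e-6):
    """Approximate the Jacobian of vector-valued f at point x
    with central differences, one input coordinate per column."""
    m = len(f(x))
    J = [[0.0] * len(x) for _ in range(m)]
    for j in range(len(x)):
        xp, xm = list(x), list(x)
        xp[j] += h
        xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2 * h)
    return J

# f(x, y) = (x*y^2, x*y) at (1, 2): rows are (y^2, 2xy) and (y, x)
J = jacobian_fd(lambda p: [p[0] * p[1]**2, p[0] * p[1]], [1.0, 2.0])
```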
<h2 id="status-of-week-2">Status of Week 2:</h2>
<ul>
<li>
<h3 id="derivative-">Derivative :</h3>
<p>The code for the derivative can be found <a href="https://github.com/ashwinpathak20/scipy/blob/diff/scipy/diff/_derivative_numdiff.py">here</a>.
The code accepts a univariate scalar function, the points at which the derivative needs to be computed, and a dictionary of options for step generation, order of derivative and order of error terms. It returns a 1D array of derivatives at the respective points.
The method uses extrapolation and adaptive steps for the derivative computation, which leads to higher accuracy compared to statsmodels.</p>
<p>Example:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> Input : derivative((lambda x: x**2), [1,2])
Output : [2,4]
</code></pre></div> </div>
</li>
<li>
<h3 id="jacobian-">Jacobian :</h3>
<p>The code for the Jacobian can be found <a href="https://github.com/ashwinpathak20/scipy/blob/diff/scipy/diff/_jacobian_numdiff.py">here</a>.</p>
<p>The code accepts a function (which can be multivariate and vector-valued as well), vectors (in the form of a 2D array) at which to compute the Jacobian, and a dictionary of options for step generation, order of derivative and order of error terms. It returns a 2D array for scalar functions and a 3D array for vector functions, where each row of the 3D array represents the Jacobian for the corresponding input vector.</p>
<p>Example:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Input : jacobian(lambda x: [[x[0]*x[1]**2], [x[0]*x[1]]], [[1,2],[3,4]])
Output : [array([[ 4., 4.],[ 2., 1.]]), array([[ 16., 24.],[ 4., 3.]])]
</code></pre></div> </div>
</li>
<li>
<h3 id="gradient-">Gradient :</h3>
<p>Gradient is considered to be a special case of Jacobian and thus, Gradient returns the value computed by Jacobian.</p>
</li>
</ul>
<h2 id="next-weeks-target">Next Week’s Target:</h2>
<ul>
<li>Hessian Implementation using statsmodels and numdifftools.</li>
<li>Analysis about the accuracy of the functions that are already implemented.</li>
</ul>
<h2 id="references">References:</h2>
<ul>
<li>https://en.wikipedia.org/wiki/Derivative</li>
<li>https://en.wikipedia.org/wiki/Gradient</li>
<li>https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant</li>
<li>https://github.com/pbrod/numdifftools/tree/master/numdifftools</li>
</ul>
GSOC 2017 - Week 1 of GSoC 17 | 2017-06-07 | https://ashwinpathak20.github.io/posts/2017/06/blog-post-3
<p>This blog is dedicated to the first week of Google Summer of Code (i.e. June 1 - June 7). The target of the first week, according to my timeline, was to get conversant with the code structure and implement the derivative using statsmodels and partly numdifftools.</p>
<ul>
<li>
<h3 id="status-of-week-1-">Status of Week 1 :</h3>
<p>I tried to stick to the plan. I have concentrated only on finite differences; complex-step methods are for the latter part of the project. I have implemented the following :</p>
<ul>
<li>
<h3 id="derivatives-using-statsmodels">Derivatives using statsmodels:</h3>
<p>I have implemented a vectorized code for derivatives using statsmodels along with docstrings and tests. It is present in scipy.diff.statsmodels._derivative.Derivative.</p>
</li>
<li>
<h3 id="gradient-using-statsdmodels">Gradient using statsmodels:</h3>
<p>I have implemented a vectorized code for gradients using statsmodels along with docstrings and tests. It is present in scipy.diff.statsmodels._derivative.Gradient.</p>
</li>
<li>
<h3 id="jacobian-using-statsmodels">Jacobian using statsmodels:</h3>
<p>I have implemented a vectorized code for jacobians using statsmodels along with docstrings and tests. It is present in scipy.diff.statsmodels._derivative.Jacobian.</p>
</li>
<li>
<h3 id="derivatives-using-numdifftools">Derivatives using numdifftools:</h3>
<p>I have been studying numdifftools while implementing it side by side. I have successfully implemented the code for derivatives; however, the code is not clean and needs modularization and refactoring. I will be doing this work in the second week.</p>
</li>
</ul>
</li>
<li>
<h3 id="updates">Updates:</h3>
<p>Link to the implemented code : <a href="https://github.com/ashwinpathak20/scipy/tree/diff/scipy/diff">here</a></p>
</li>
<li>
<h3 id="next-target">Next Target:</h3>
<p>Next week is dedicated to computation of derivatives, gradients and jacobian using finite differences from numdifftools.</p>
</li>
</ul>
GSOC 2017 - Proposal and basic structure of API | 2017-05-28 | https://ashwinpathak20.github.io/posts/2017/05/blog-post-2
<p>This is the basic outline of the design of the scipy.diff module. I have thoroughly investigated the functioning of statsmodels and numdifftools, and scipy.misc.derivative has also been looked at. In scipy.diff I propose to have adaptations from numdifftools and statsmodels, mainly because of their accurate results, sophisticated computation techniques and ease of accessibility.</p>
<ul>
<li>
<h3 id="platform-for-coding-">Platform for coding :</h3>
<p>As the base code of SciPy is present on GitHub, and because of its wide usability and easy-to-use features, I plan to do the coding of the module on GitHub under my forked scipy repository.</p>
</li>
<li>
<h3 id="numerical-differentiation-">Numerical Differentiation :</h3>
<p>For numerical differentiation, I will use the finite difference method along with Richardson extrapolation for better accuracy. Complex-step approximation will also be implemented in the latter part of the summer.</p>
<p>I have decided to implement the following:</p>
<ul>
<li>Derivative</li>
<li>Gradient</li>
<li>Jacobian</li>
<li>Hessian</li>
<li>Hessdiag</li>
</ul>
</li>
<li>
<h3 id="statsmodels">Statsmodels</h3>
<p>Statsmodels provides fast and accurate results for numerical differentiation using first differences. It also computes derivatives using complex-step derivative approximations. The code is very simple and easy to understand; however, it is not modular and contains no classes, which is why it has to be refactored. The default method is forward differences, which I think should be changed to central because of the central method's higher accuracy.
However, there is a concern about the analyticity of a function when computing numerical derivatives using complex steps. This issue needs to be discussed with the mentors to get their suggestions.
Statsmodels can be used to compute the Derivative, Gradient, Jacobian and Hessian.</p>
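<p>The accuracy gap between forward and central differences is easy to demonstrate (a quick check supporting the suggested default change; the test function and step size here are chosen arbitrarily):</p>

```python
import math

def forward(f, x, h):
    """First-order forward difference: O(h) error."""
    return (f(x + h) - f(x)) / h

def central(f, x, h):
    """Second-order central difference: O(h**2) error."""
    return (f(x + h) - f(x - h)) / (2 * h)

x, h = 1.0, 1e-4
true = math.cos(x)  # exact derivative of sin at x
err_forward = abs(forward(math.sin, x, h) - true)
err_central = abs(central(math.sin, x, h) - true)
```

At h = 1e-4 the forward-difference error is on the order of h/2 * |f''(x)| (about 4e-5 here), while the central-difference error is on the order of h&sup2;/6 * |f'''(x)| (about 1e-9), several orders of magnitude smaller.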
</li>
<li>
<h3 id="numdifftools">Numdifftools</h3>
<p>Numdifftools provides higher accuracy than statsmodels thanks to Richardson extrapolation and adaptive step sizes. However, due to looping, its methods are a bit slower than statsmodels. The code is highly modular and implemented using OOP concepts. It provides a wide range of options, such as order of accuracy, order of differentiation, step generator methods, etc.
Numdifftools uses adaptive step sizes to calculate finite differences, but faces the dilemma of choosing a step size small enough to reduce truncation error while still avoiding subtractive cancellation at values that are too small.</p>
</li>
<li>
<h3 id="structure-and-functions">Structure and Functions:</h3>
<p>I decided to write fully modularized code using OOP concepts so that the code becomes easy to interpret, understand and maintain.
Following is the basic set of inputs and the output for Derivative, Gradient, Jacobian, Hessian and Hessdiag.</p>
<ul>
<li>
<h4 id="input">Input</h4>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>fun: Input function
function of one array.
step: float, array-like or StepGenerator object, optional
default can be min_step_generator.
method: {‘central’, ‘complex’, ‘multicomplex’, ‘forward’, ‘backward’}, optional
default will be central.
order: int, optional
defines the order of the error term in the Taylor approximation used. For ‘central’ and
‘complex’ methods, it must be an even number. Default is 2.
n: int, optional
Order of the derivative. Default is 1.
tool: {‘numdifftools’,‘statsmodels’}, optional
default will be numdifftools.
*Note: This part needs to be discussed with the mentors.
</code></pre></div> </div>
</li>
<li>
<h4 id="output">Output</h4>
</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> der: ndarray
array of derivatives
</code></pre></div> </div>
<p>The module will contain the following files:</p>
<ul>
<li>extrapolation.py</li>
<li>derivative.py</li>
<li>gradient.py</li>
<li>jacobian.py</li>
<li>hessian.py</li>
<li>step_generator.py</li>
<li>tests (folder)</li>
<li>docs (folder)</li>
</ul>
<p>However, this is subject to change as per the requirements.</p>
</li>
<li>
<h3 id="project-timeline">Project Timeline:</h3>
<p>The project timeline is the same as in my GSoC proposal (as of now): <a href="https://docs.google.com/document/d/1WQwpD4VU3cewBH99a_2-3CcTmhtbclgS6JR1lVXmcYA/edit?usp=sharing">link</a></p>
</li>
<li>
<h3 id="to-be-discussed">To be discussed:</h3>
<ul>
<li>Which tool (statsmodels or numdifftools) should be given higher priority, and when should one method be adopted over the other?</li>
<li>Should a tolerance be used to choose one tool over the other?</li>
<li>Is there any need to include naive implementations from scipy.misc?</li>
</ul>
</li>
</ul>
<p><strong><em>NOTE</em></strong> This proposal will be altered and evolved according to discussions with the mentors.</p>
GSOC 2017 - Implementation of scipy.diff | 2017-05-15 | https://ashwinpathak20.github.io/posts/2017/05/blog-post-1
<p>I have been selected for GSoC 2017 under the umbrella organisation of the Python Software Foundation - SciPy. The topic of my project is the implementation of a new module: scipy.diff.</p>
<p>This project aims at implementing numerical derivative calculation methods which are a set of core scientific numerical tools that are currently missing in SciPy. There has been discussion (and general agreement) on creating a new scipy.diff sub-package starting with the numdifftools code and some code in statsmodels.</p>
<p>Please find a link to the proposal <a href="https://docs.google.com/document/d/1WQwpD4VU3cewBH99a_2-3CcTmhtbclgS6JR1lVXmcYA/edit?usp=sharing">here</a>.</p>