Add support for efficient high accuracy convolution interpolation (up to 7th order accurate C11 continuous) in any number of dimensions #609
base: master
Conversation
This change implements the algorithm from "Cubic Convolution Interpolation for Digital Image Processing" by Robert G. Keys (1981), IEEE Transactions on Acoustics, Speech, and Signal Processing. Its order of accuracy lies between that of linear interpolation and cubic splines.
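For reference, the 1D kernel from Keys' paper (with the standard parameter a = -1/2) can be sketched in Python as follows; the function names are illustrative and not taken from this PR's Julia code:

```python
import math

# Keys (1981) cubic convolution kernel with the standard parameter a = -1/2.
def keys_kernel(x, a=-0.5):
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * (x**3 - 5 * x**2 + 8 * x - 4)
    return 0.0

# 1D interpolation of uniformly spaced samples at fractional index t,
# clamping indices at the edges (one simple boundary treatment).
def cubic_conv_interp(samples, t):
    i = math.floor(t)
    s = t - i
    total = 0.0
    for k in range(-1, 3):  # the four nearest samples
        j = min(max(i + k, 0), len(samples) - 1)
        total += samples[j] * keys_kernel(s - k)
    return total
```

At grid points the kernel acts as a delta (weight 1 at the sample itself, 0 elsewhere), and linear data is reproduced exactly between grid points.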
I updated the requested names. I added 3D boundary conditions and some smaller improvements.
Specialized boundary and interpolation methods are kept up to 3D to ensure speed.
Both 3rd and 4th order convergence are available, with 4th order as the default. Extrapolation is possible via boundary conditions; by default, attempting extrapolation raises an error.
The constructed interpolator is tested in 1D, 2D, 3D, and 4D against random data at grid points, against linear mean values between grid points, and for two extrapolation boundary conditions.
This allows for high accuracy (up to C11 continuity) interpolation in arbitrary dimensions, as long as the data is uniformly spaced for each dimension.
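As a rough, hedged illustration of checking a convergence order empirically (using the classic 3rd-order Keys kernel in Python rather than this PR's Julia code): halving the grid spacing should reduce the maximum interior error by about 2^3 = 8 for a 3rd-order scheme.

```python
import math

def keys_kernel(x, a=-0.5):
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * (x**3 - 5 * x**2 + 8 * x - 4)
    return 0.0

def cubic_conv_interp(samples, t):
    i = math.floor(t)
    s = t - i
    return sum(samples[min(max(i + k, 0), len(samples) - 1)] * keys_kernel(s - k)
               for k in range(-1, 3))

def max_error(n):
    # Sample sin(x) on [0, pi] with n intervals and measure the worst
    # interpolation error at off-grid points away from the boundaries.
    h = math.pi / n
    samples = [math.sin(k * h) for k in range(n + 1)]
    return max(abs(cubic_conv_interp(samples, k + 0.25) - math.sin((k + 0.25) * h))
               for k in range(2, n - 2))

ratio = max_error(64) / max_error(128)  # roughly 8 for a 3rd-order scheme
```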
small bugfix
Codecov Report
Attention: Patch coverage is
Additional details and impacted files

@@            Coverage Diff            @@
##           master     #609       +/- ##
==========================================
- Coverage   87.07%   69.38%    -17.69%
==========================================
  Files          28       35         +7
  Lines        1888     2375       +487
==========================================
+ Hits         1644     1648         +4
- Misses        244      727       +483

View full report in Codecov by Sentry.
I submitted an early version of this previously, but ended up deleting the pull request (I wanted to create my own package).
Some of this code looks generated or computed. Could you include the source used to generate the values or code?
Is some of this code machine generated? If so, could you elaborate how?
Also, I wonder if there might be a better method for evaluating some of the polynomials. Horner's method?
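For context, Horner's method evaluates a degree-n polynomial with n multiplications and n additions. A minimal Python sketch (the coefficients in the usage note are arbitrary placeholders, not the PR's kernel coefficients):

```python
# Horner's method: c0 + x*(c1 + x*(c2 + ...)) — n multiplies for degree n.
def horner(coeffs, x):
    """Evaluate a polynomial; `coeffs` are ordered from lowest to highest degree."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result
```

For example, `horner([1.0, 2.0, 3.0], 2.0)` evaluates 1 + 2x + 3x^2 at x = 2, giving 17.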
Yes, I can submit the Python script for deriving the polynomial coefficients when I get home in a couple of days. As I already stated, I simply generalized the methodology of the paper I cited, using SymPy for the derivations. In theory the script can derive polynomials of arbitrary degree; however, the coefficients get even longer than they already are, so I thought stopping at 13th degree was sufficient (even though higher degrees would likely perform even better).
Yes, I got assistance from Claude Sonnet 3.5, mostly for debugging code I wrote that didn't work initially. You are likely correct that this isn't the most efficient way to evaluate the polynomials; I welcome improvements. 😊 This was also the reason I developed the precompute method, which is very efficient: it is based on a single sorted lookup and a dot product, making it almost independent of the polynomial degree. So it may be worth deriving even higher degree polynomials in the future; I only saw improvements with higher degrees, which is why I kept going. 😊 For the boundary condition I use a linear prediction, where I have presolved the linear systems for the coefficients. That symbolic linear solve I did using Symbolics.jl. 😊
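The precompute idea described above can be sketched as follows (a hedged Python illustration with the Keys cubic kernel; the PR's actual implementation and names will differ): kernel weights are tabulated once on a fine grid of fractional offsets, so each evaluation reduces to a table lookup plus a dot product, nearly independent of the polynomial degree.

```python
import numpy as np

def keys_kernel(x, a=-0.5):
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * (x**3 - 5 * x**2 + 8 * x - 4)
    return 0.0

def precompute_table(kernel, resolution=1024):
    # Row r holds the 4 weights for fractional offset r / (resolution - 1).
    fracs = np.linspace(0.0, 1.0, resolution)
    return np.array([[kernel(f - k) for k in range(-1, 3)] for f in fracs])

def lookup_interp(samples, t, table):
    i = int(np.floor(t))
    frac = t - i
    weights = table[int(round(frac * (len(table) - 1)))]  # nearest tabulated row
    idx = np.clip(np.arange(i - 1, i + 3), 0, len(samples) - 1)
    return float(np.dot(samples[idx], weights))
```

The table cost is paid once; afterwards the polynomial is never re-evaluated, only looked up, which is why higher kernel degrees add essentially no per-query cost.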
All of these coefficients are analytically/symbolically derived. I would not trust generative AI with a task of exact math like this; that would be bound to go wrong. For this task, SymPy and Symbolics.jl are the ideal choices. 😊
I just want to demonstrate the improved frequency response when going from the 3rd degree polynomial to my 13th degree polynomial kernel (this is part of my derivation script, which I will upload soon). In the paper 10.1109/83.743854 the authors conclude that "... However, higher order schemes only yield marginal improvement, at an increased computational cost." What they overlook is that extra equations can be included in the kernel to improve the frequency response and order of accuracy. The best cutoff slope they achieve (at f=-0.5 Hz) is 2.538; my 13th degree kernel's cutoff slope at the same point is 3.8, and I can keep going higher without increasing the computational cost (if the time spent in the kernel precompute step can be neglected).
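As a hedged illustration of how such a frequency response can be computed (here for the ordinary 3rd degree Keys kernel in Python, not the 13th degree kernel from the derivation script): sample the kernel densely and approximate its continuous Fourier transform with an FFT.

```python
import numpy as np

def keys_kernel(x, a=-0.5):
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * (x**3 - 5 * x**2 + 8 * x - 4)
    return 0.0

# Dense samples of the kernel over its support [-2, 2].
dx = 1.0 / 64
xs = np.arange(-2.0, 2.0, dx)
h = np.array([keys_kernel(x) for x in xs])

# FFT approximation of the continuous Fourier transform magnitude.
H = np.abs(np.fft.rfft(h)) * dx
freqs = np.fft.rfftfreq(len(h), d=dx)  # in cycles per grid spacing
```

The DC response is 1 (the kernel integrates to one), and the magnitude falls off toward the Nyquist frequency and (nearly) vanishes at integer frequencies, which is the lowpass behaviour whose cutoff slope the comment above compares across kernel degrees.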
This script can derive kernels of arbitrary accuracy
Now I have committed the Python script I mentioned, which I wrote to derive the kernels.
This script uses SymPy.jl to derive the linear prediction coefficients
I remembered incorrectly when I said I used Symbolics.jl for deriving the linear prediction coefficients. I should also mention the GaussianConvolutionKernel, which is in the convolution_kernels.jl file.
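Since the definition of GaussianConvolutionKernel is not shown here, the following is only a generic, hypothetical Python sketch of a truncated Gaussian convolution kernel with weights renormalized to sum to one; the actual kernel in convolution_kernels.jl may differ in both form and parameters.

```python
import math

# Hypothetical truncated Gaussian kernel; sigma and support are guesses,
# not values taken from convolution_kernels.jl.
def gaussian_kernel(x, sigma=0.6, support=2.0):
    if abs(x) >= support:
        return 0.0
    return math.exp(-x * x / (2.0 * sigma * sigma))

def gaussian_weights(frac, support=2):
    # Weights for the 2*support nearest samples, renormalized so that a
    # constant signal is reproduced exactly.
    offsets = range(-support + 1, support + 1)
    w = [gaussian_kernel(frac - k) for k in offsets]
    total = sum(w)
    return [wi / total for wi in w]
```

Unlike the polynomial kernels above, a raw Gaussian does not interpolate (its weights at grid points are nonzero for neighbours), which is why the renormalization step matters for at least preserving constants.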
Could you organize this as follows? The purpose is to have a reproducible environment in which to generate the coefficients.
Please reorganize the generating code under a top-level gen folder, as per my prior comment, so that we have reproducible environments in which to execute the code.
Convolution-Based Interpolation
This PR adds support for convolution-based interpolation methods to Interpolations.jl.
Features
Basic Usage Example