Recall from the previous post our derivation of the Gauss circle recurrence relation, generalized to arbitrary dimensions, scaling $\sigma$, offsets $\mu_d$, and weights $w_{d,k}$:

$$N_d(n) = \sum_{k \in \mathbb{Z}} w_{d,k}\, N_{d-1}\!\left(n - (\sigma k + \mu_d)^2\right),$$
where computing $N_d(n)$ for all squared distances $n \le n_{\max} = r_{\max}^2$ up to the max distance $r_{\max}$ requires $O(d\, r_{\max}^3 / \sigma)$ flops. In practice, the summation should be transposed so that $n$, rather than $k$, is incremented within the inner loop, so as to take advantage of vectorization/SIMD-friendly contiguous memory access patterns.
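As a concrete illustration, here is a minimal NumPy sketch of the direct evaluation in the simplest setting (unit scaling, zero offsets, unit weights), with the summation transposed so the inner update sweeps contiguously over $n$. The function name `shell_counts` is ours, not from the previous post:

```python
import numpy as np

def shell_counts(d, n_max):
    """Lattice points on each squared-distance shell in d dimensions.

    r[n] = #{x in Z^d : ||x||^2 == n}, computed via the recurrence
    r_d(n) = sum_{k^2 <= n} r_{d-1}(n - k^2) with k ranging over Z.
    Unit scaling, zero offset, unit weights (the simplest case).
    """
    r = np.zeros(n_max + 1, dtype=np.int64)
    r[0] = 1  # base case: the single 0-dimensional lattice point
    for _ in range(d):
        nxt = np.zeros_like(r)
        # Transposed summation: outer loop over k, inner (vectorized)
        # update over n, so memory access is contiguous/SIMD-friendly.
        k = 0
        while k * k <= n_max:
            sq = k * k
            if k == 0:
                nxt[sq:] += r[: n_max + 1 - sq]
            else:  # +k and -k both contribute
                nxt[sq:] += 2 * r[: n_max + 1 - sq]
            k += 1
        r = nxt
    return r

# Lattice points inside the circle of radius 2: cumulative sum at n = 4.
counts = shell_counts(2, 4)
print(counts.cumsum()[4])  # → 13
```

A cumulative sum over the shells recovers the familiar Gauss circle count, e.g. 13 points within radius 2 and 81 within radius 5.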
A second approach, for those familiar with DSP, uses the fact that the summation term in the recurrence relation can be cast as a form of sparse convolution, owing to the quadratic-striding memory access pattern of the index $n - (\sigma k + \mu_d)^2$. To make the convolution operation explicit, we can vectorize $N_{d-1}$ into a length-$(n_{\max}+1)$ array and re-express the summation as follows:

$$N_d[n] = (s_d * N_{d-1})[n] = \sum_{m=0}^{n} s_d[m]\, N_{d-1}[n - m], \qquad s_d[m] = \sum_{k \in \mathbb{Z}} w_{d,k}\, \big[\, m = (\sigma k + \mu_d)^2 \,\big],$$
where $[\,\cdot\,]$ is an indicator (Iverson) bracket. Note that when building $s_d$, a larger boundary scaling term $\sigma$ spreads the nonzero entries further apart and so increases the signal sparsity. From a computational perspective, this means there are regions in the parameter space of $(\sigma, r_{\max})$ where either direct sparse convolution or direct evaluation of the recurrence relation will be faster than an implementation-dependent Fast Fourier Transform (FFT) based convolution, despite the latter's lower asymptotic cost of $O(r_{\max}^2 \log r_{\max})$ flops.
With the distance bound and Gauss circle problem out of the way, we will continue our investigation of lattice volume bounds in terms of the $\ell_1$ (taxi-cab) distance in the next post.