-/* This is one of those 'mathematicians should not write code' kind of
- cases. Newton's method of polishing roots is straightforward
- enough... except in those cases where it just fails in the real
- world. In our case below, we're worried about a local minimum or
- maximum shooting a root estimate off to infinity, or the new
- estimate chaotically oscillating about convergence (which shouldn't
- actually be a problem in our usage).
-
- Maehly's modification (zero suppression, to prevent two tentative
- roots from collapsing to the same actual root) similarly can
- temporarily shoot a root off toward infinity. It would come
- back... if it were not for the fact that machine representation has
- limited dynamic range and resolution. This too is guarded by
- limiting delta.
-
- Last problem is convergence criteria; we don't know what a 'double'
- is on our hardware/compiler, and the convergence limit is bounded
- by roundoff noise. So, we hack convergence:
-
- Require at most 1e-6 mean squared error for all zeroes. When
- converging, start the clock ticking at 1e-6; limit our polishing to
- as many more iterations as took us to get this far, 100 max.
-
- Past max iters, quit when MSE is no longer decreasing *or* we go
- below ~1e-20 MSE, whichever happens first. */
-
-static void Newton_Raphson_Maehly(float *a,int ord,float *r){
- int i, k, count=0, maxiter=0;
- double error=1.,besterror=1.;
- double *root=alloca(ord*sizeof(double));
-
- for(i=0; i<ord;i++) root[i] = 2.0 * (i+0.5) / ord - 1.0;
-
- while(error>1.e-20){
+
+/* for spit-and-polish only */
+static int Newton_Raphson(float *a,int ord,float *r){
+ int i, k, count=0;
+ double error=1.;
+ double *root=alloca(ord*sizeof(*root));
+
+ for(i=0; i<ord;i++) root[i] = r[i];
+
+ while(error>1e-20){