It looks like the definition of lu-decomp (linear-algebra/lu.lisp) is incorrect.  Instead of (signum :int) it should read (signum :pointer), right?  My code (see below) works with this change, but not without it (it complains that the pointer isn't a fixnum).

However, signum is really just a return value of the function.  Depending on how high-level this should all be, we may want to write a wrapper that allocates that memory and passes it back as a second value (a rough sketch follows).  But then we would have to worry about memory leaks, since the GC won't free foreign memory, right?  Further, this `function' really just modifies its arguments.  I guess the Lisp way would be to turn it into a function that returns the modified structures and to mark it as destructive...  It is hard for me to make Lisp style and C style coexist in my head.
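
Something like the following is what I have in mind; this is only a sketch, assuming lu-decomp takes the signum as a raw :pointer as suggested above (lu-decomp* is just a name I made up, and the unwind-protect is there so the foreign integer is freed even if the GSL call signals an error):

(defun lu-decomp* (matrix permutation)
  "LU-decompose MATRIX in place; return it, with the sign of the
permutation as a second value."
  (let ((sig (cffi:foreign-alloc :int)))
    (unwind-protect
         (progn
           (gsl:lu-decomp matrix permutation sig)
           (values matrix (cffi:mem-ref sig :int)))
      (cffi:foreign-free sig))))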

How were the GSL definitions generated?  How did this :int/:pointer confusion slip through?  If the point here is to make a system where mathematics can be done interactively, e.g. (invert-matrix #2A((1 0) (3 4))) ==> #2A(...), then a big part of this is going to be writing wrapper functions that hide the C-like nuts and bolts that make GSL a bit annoying to use in the first place.

(defun invert-matrix (mat)
  (gsl:letm ((mat2 (gsl:matrix-double mat))             ; GSL copy of MAT
             (per (gsl:permutation 2))
             (inv (gsl:matrix-double 2 2))
             (sig (cffi:foreign-alloc :int :count 1)) ) ; raw foreign int for the signum
    (gsl:lu-decomp mat2 per sig)  ; destructively LU-decomposes MAT2
    (gsl:lu-invert mat2 per inv)  ; writes the inverse into INV
    (cffi:foreign-free sig)       ; never reached if the GSL calls signal an error
    (gsl:data inv) ))             ; return the inverse as a Lisp array

How can I get rid of the explicit foreign-alloc/free?  Or is this outside the scope of GSLL?  (It seems like a borderline case to me.)  The closest I have come is below.
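
This sketch gives the signum dynamic extent with CFFI's with-foreign-object, so it is released automatically even on a non-local exit; the GSLL calls are the same as in the snippet above:

(defun invert-matrix (mat)
  (cffi:with-foreign-object (sig :int)  ; freed automatically when the body exits
    (gsl:letm ((mat2 (gsl:matrix-double mat))
               (per (gsl:permutation 2))
               (inv (gsl:matrix-double 2 2)))
      (gsl:lu-decomp mat2 per sig)
      (gsl:lu-invert mat2 per inv)
      (gsl:data inv))))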

Thanks,
Zach