"Faré" fahree@gmail.com writes:
OK, but why not use (cffi:foreign-type-size :pointer) instead of (cffi:foreign-type-size :int) to determine whether the OS is 32- or 64-bit?
I think (cffi:foreign-type-size :pointer) will always do the right thing, whereas (cffi:foreign-type-size :int) won't.
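Concretely, the suggestion amounts to something like this sketch (a hypothetical illustration keyed to the running image's pointer size, not iolib's actual code):

    ;; Assumes CFFI is already loaded.
    (defun gcc-cpu-flags ()
      ;; Key the flag to the running Lisp image's word size, which
      ;; CFFI reports as the size of a foreign pointer.
      (ecase (cffi:foreign-type-size :pointer)
        (4 (list "-m32"))
        (8 (list "-m64"))))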
Wrong wrong wrong!
The -m32 and -m64 flags are both needed, because you may be running on a machine where the default distribution is of one word size, but your Lisp process is of a different size!
Here at ITA, we casually compile 32-bit Lisp executables on 64-bit machines, and the reverse is not inconceivable either.
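For instance, here is a REPL sketch (values as they would appear in such an image) from a 32-bit Lisp running on a 64-bit host:

    CL-USER> (cffi:foreign-type-size :pointer)
    4
    ;; The process is 32-bit even though the host and gcc's default
    ;; target are 64-bit, so an unflagged gcc build produces objects
    ;; this image cannot load -- hence the explicit -m32.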
[ François-René ÐVB Rideau | Reflection&Cybernethics | http://fare.tunes.org ] When everything seems to be going against you, remember that the airplane takes off against the wind, not with it. -- Henry Ford
On 06/06/07, Chun Tian binghe.lisp@gmail.com wrote:
Hi, iolib developers
I'm not sure why we need a (gcc-cpu-flags) function to detect a gcc compile flag (-m32/-m64) and use it to compile the C files.
First, using (cffi:foreign-type-size :int) to guess is wrong, at least on amd64 Linux: (cffi:foreign-type-size :int) returns 4 there, so you wrongly guess 32-bit.
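For example, on a 64-bit Lisp on amd64 Linux (an LP64 platform: int stays 32 bits while pointers are 64), the two calls disagree:

    CL-USER> (cffi:foreign-type-size :int)
    4
    CL-USER> (cffi:foreign-type-size :pointer)
    8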
Second, if you guess wrong, a 64-bit Lisp process cannot load the resulting 32-bit library.
If I disable this (gcc-cpu-flags), gcc with no -m32/-m64 always does the right thing on both 32-bit and 64-bit platforms, and the Lisp process can load the resulting library. (I'm doing just this on Debian GNU/Linux amd64 with LispWorks 5.0.2 Enterprise Edition for AMD64 Linux.) Am I right?
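For what it's worth, one quick way to check for such a mismatch from the REPL (the library name below is assumed for illustration; substitute whatever .so iolib's build actually produces):

    (handler-case
        (cffi:load-foreign-library "libiolib-syscalls.so") ; name assumed
      (cffi:load-foreign-library-error (c)
        ;; A word-size mismatch shows up as an immediate load failure.
        (format *error-output* "~&load failed: ~A~%" c)))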