Greetings, list.
I'm running into a bit of a conundrum. I'm calling stat(2) and
statfs(2) on Mac OS X and getting back the old 32-bit structure instead
of the new 64-bit structure.
If I compile a test program in C with just <sys/mount.h> included, I
get the 64-bit structure; when I use DEFCFUN I get the 32-bit one. The
same goes for <sys/stat.h>. I'm currently working around it with #+,
but I'd like some help understanding what's going on so I can do a
proper implementation.
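For concreteness, the binding is essentially the obvious one (the Lisp
name and argument list below are just illustrative, and the struct
layout itself is defined elsewhere):

--8<---------------cut here---------------start------------->8---
;; Sketch of the plain binding, assuming CFFI is already loaded.  On
;; Darwin this resolves to the legacy entry point and fills in the old
;; 32-bit layout rather than the 64-bit one.
(cffi:defcfun ("statfs" %statfs) :int
  (path :string)
  (buf  :pointer))
--8<---------------cut here---------------end--------------->8---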
I read in the include files and man pages that the *64 functions are a
temporary measure while old code gets updated and that users should
never actually call them. In the header they are declared as separate
functions (see the statfs excerpt below), but there is some
preprocessor magic happening. There's a pair of defines,
_DARWIN_USE_64_BIT_INODE and _DARWIN_NO_64_BIT_INODE, that you can set
before the include to choose the default behavior, but CFFI doesn't do
any grovelling for function definitions.
--8<---------------cut here---------------start------------->8---
int statfs(const char *, struct statfs *) __DARWIN_INODE64(statfs);
#if !__DARWIN_ONLY_64_BIT_INO_T
int statfs64(const char *, struct statfs64 *) __OSX_AVAILABLE_BUT_DEPRECATED(__MAC_10_5,__MAC_10_6,__IPHONE_NA,__IPHONE_NA);
#endif /* !__DARWIN_ONLY_64_BIT_INO_T */
--8<---------------cut here---------------end--------------->8---
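If I'm reading <sys/cdefs.h> right, __DARWIN_INODE64 just attaches an
asm label to the declaration, so C code compiled with 64-bit inodes
enabled ends up referencing the symbol _statfs$INODE64, while the plain
name keeps pointing at the old entry point. DEFCFUN resolves the name
dynamically (dlsym or the equivalent), which never sees that rename, so
it finds the legacy function. A quick way to check that theory from the
REPL, assuming the suffixed symbol is looked up without the leading
underscore as usual on Darwin:

--8<---------------cut here---------------start------------->8---
;; Both symbols should be visible in the running image if the
;; asm-label theory is right.
(cffi:foreign-symbol-pointer "statfs")          ; legacy entry point
(cffi:foreign-symbol-pointer "statfs$INODE64")  ; 64-bit variant
--8<---------------cut here---------------end--------------->8---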
The whole thing caused me a bunch of confusion because I was getting
junk out of CONVERT-FROM-FOREIGN with :INODE64 in *FEATURES*, so now
that I've got it working I'm curious what was actually going on.
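For completeness, the workaround I have at the moment is just a reader
conditional on the foreign name, along these lines (sketch only; the
argument list is illustrative and :INODE64 is the feature mentioned
above):

--8<---------------cut here---------------start------------->8---
;; Pick the foreign symbol with #+/#-: the suffixed name when the
;; image was set up for 64-bit inodes, the plain name otherwise.
(cffi:defcfun (#+inode64 "statfs$INODE64"
               #-inode64 "statfs"
               %statfs)
    :int
  (path :string)
  (buf  :pointer))
--8<---------------cut here---------------end--------------->8---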