For some reason, on OS X 10.5, xdr_long actually takes an int * rather than a long *, and xdr_u_long takes an unsigned int *, when compiling for 64-bit. Here are the relevant lines from /usr/include/rpc/xdr.h:
#if defined(__LP64__)
extern bool_t xdr_long(XDR *, int *);
extern bool_t xdr_u_long(XDR *, unsigned int *);
#else
extern bool_t xdr_long(XDR *, long *);
extern bool_t xdr_u_long(XDR *, unsigned long *);
#endif
I can only guess that they did this so that xdr_long and xdr_u_long operate on the same data size (32 bits, matching the XDR wire format) whether compiled in 32-bit or 64-bit mode.
I include a patch here that slightly modifies default_io.cpp to work around Apple's craziness and lets GDL compile and run on 64-bit Apple Intel.