From: Johann D. <jo...@do...> - 2002-05-06 17:03:36
Rodrigo Damazio wrote:
> Johann Deneux wrote:
>
>>>> The point is that input.h should include the header defining
>>>> uint16_t.
>>>
>>> I don't think so. The kernel routines using uint16_t should include
>>> the kernel header <linux/types.h>, while the userspace routines
>>> should use the standard header (<stdint.h>).
>>
>> Perhaps a conditional include would do it? I personally hate it when
>> headers depending on other headers don't handle these dependencies by
>> themselves, forcing the user to guess what could be the right thing
>> to include in order to use some header.
>
> Me too... I'm not exactly sure what uint16_t would conflict with if
> it were defined outside the kernel, but the fact is that it is only
> defined within kernel code (see asm/types.h)... About being

No. I think you completely missed the point. uint16_t *is* a standard
C99 type, defined in <stdint.h>. Did you have a look at the chapter
Brad pointed to about the use of types in Linux drivers? See here:
http://www.xml.com/ldd/chapter/book/ch10.html

> kernel-specific, the linux/input.h is kernel-dependent too, so I
> guess maybe it's ok to use __u16??

Sure, it will still work.

> How about making it more practical - will everything work if we leave
> __u16?? 'cause non-kernel programs (such as fftest.c) have trouble
> compiling with uint16_t... I don't see a reason not to leave it __u16
> if it works fine...

The argument would be "Why use the kernel-specific type if there is
already a C99 standard type for this?". The danger is that a userland
developer notices these __u16 types and starts using them elsewhere in
his program. Later, he needs an unsigned 16-bit int. What type will he
use? He could be misled into using __u16. The problem will arise when
he wants to port his software to some other OS: __u16 won't be there,
uint16_t will. We could warn people and tell them not to use __u16,
but a better solution is not to show it at all. A conditional include,
as sketched below, would give us that.

Btw, I wonder what the kernel hackers were thinking when they decided
to use u16 for kernel-internal values and __u16 for public interfaces.
Saying it's counter-intuitive is quite an understatement.

--
Johann Deneux
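
A minimal sketch of what such a conditional include could look like,
assuming the usual __KERNEL__ guard; the ff_level_t alias and the
ff_effect_sketch struct are made up for illustration, they are not the
actual contents of input.h:

#ifdef __KERNEL__
#include <linux/types.h>     /* kernel build: provides __u16 */
typedef __u16 ff_level_t;    /* hypothetical alias, for illustration */
#else
#include <stdint.h>          /* userspace build: C99 uint16_t */
typedef uint16_t ff_level_t;
#endif

/* Hypothetical structure using the alias: an unsigned 16-bit field in
 * both worlds, without __u16 ever appearing in userland code. */
struct ff_effect_sketch {
	ff_level_t strong_magnitude;
	ff_level_t weak_magnitude;
};

With something like this, fftest.c would compile against plain
<stdint.h> while the kernel keeps using its own types.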