From: Colin M. <co...@ch...> - 2009-05-08 01:27:03
Larry McVoy wrote:
> Yeah, that's a limitation of living within tcl's retarded philosophy
> of everything is a string.

Larry, you're on record as saying that you're only interested in ASCII.
I have good news for you: Tcl represents non-ASCII characters, in which
you have no interest.  So any application you write can detect a
non-ASCII character and interpret it as NULL.  There is no need to
extend the Tcl set of denotatable values, since you can always choose a
value outside your discursive domain (a sketch follows in the P.S.
below).

Tcl doesn't have a distinguished NULL value (or non-value), just as the
language of arithmetic doesn't have a value which represents the result
of evaluating 1/0.  The fact that 1/0 doesn't yield a value in
arithmetic is not a consequence of religious dogma, but a logical
consequence of the conventional meaning of '1', '0' and '/'.  To
pretend that you're a martyr to NULL is as silly as pretending you
would be burned at the stake for saying 'but it DOES have a value, it's
INFINITY (and beyond)!' - not persecuted, just silly.

It is certainly the case that the IEEE float and double domains have an
Inf and a NaN value, but you will see that these values are readily
distinguishable from the denotations of numerals in their respective
domains.  I think, given that you only use ASCII, and assuming you can
control the inputs to your programs, you ought to take a leaf out of
IEEE's book and use non-ASCII values to denote your exceptional values,
just like the rest of us do.

> It's a fine idea but go look at what the other languages do and why
> and you look at the EIAS religion and shake your head.  Whatever, I'm
> not going to change any minds here on that, it's like going to church
> and preaching evolution.

Wouldn't know, don't go to church.

> So you can't do
>
>   set foo undef
>   set bar ""
>   if {$foo eq $bar} { puts wrong }

But that does work.  "undef" is not equal to "", and so you get
predictable, useful and consistent behaviour.

> But you can do
>
>   if {[defined $foo]} { puts wrong }
>
> and that one works fine.  Other than crazy games with undefined
> unicode chars or whatever, you don't have much choice.

If you don't 'believe' in non-ASCII characters, as you have said you
don't, then I think your applications are safe from heresy, and you can
use *defined* but non-ASCII unicodes to represent your distinguished
(non-) value NULL.  Everyone's happy.

Colin.
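
P.S.  For concreteness, here is a minimal sketch of the sentinel
approach, assuming an ASCII-only application domain.  The NULL
variable, the codepoint chosen for it, and the isnull helper are all
illustrative names of my own, not anything built into Tcl:

  # Pick a sentinel the application's ASCII-only inputs can never contain.
  # U+2400 (SYMBOL FOR NULL) is used here purely as an example.
  set NULL \u2400

  # Hypothetical helper: true iff a value is the NULL sentinel.
  proc isnull {value} {
      global NULL
      return [expr {$value eq $NULL}]
  }

  set foo $::NULL
  set bar ""
  if {[isnull $foo] && ![isnull $bar]} {
      puts "foo is NULL, bar is the empty string, and the two are distinct"
  }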
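
P.P.S.  On the quoted [defined $foo]: I'm assuming that is a command
from Larry's own environment rather than stock Tcl, and that it tests
whether a variable has been given a value at all.  In plain Tcl that
test is spelled [info exists]; a sketch of a comparable helper (note it
takes a variable name, where the quoted example passes $foo, so the
calling convention may not match):

  # Hypothetical wrapper; "defined" is not a standard Tcl command.
  proc defined {varName} {
      upvar 1 $varName var
      return [info exists var]
  }

  if {[defined foo]} { puts "foo is set" } else { puts "foo is unset" }
  set foo ""
  if {[defined foo]} { puts "foo is set, even though it holds the empty string" }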