On Sat, Feb 13, 2010 at 5:06 PM, Tobias C. Rittweiler <firstname.lastname@example.org> wrote:

> I found it hard to follow what you were saying, for it only consisted of
> words, and translating words to actual code is difficult. (It also does
> not help that my concentration is low due to sickness.)
I will not write real code but pseudocode, if that is OK. The example we have at hand has the following structure:
;; In a different file, package FOO is created and uses CL
(defpackage "FOO" (:use "CL"))
;; Forms executed at compile time, at load time and when interpreted
(eval-when (:compile-toplevel :load-toplevel :execute)
  (unintern 'cl:in-package "FOO")                  ; 1* remove IN-PACKAGE from FOO
  (shadow "IN-PACKAGE" "FOO")                      ; 2* create internal FOO::IN-PACKAGE
  (export (find-symbol "IN-PACKAGE" "FOO") "FOO")) ; 3* export FOO:IN-PACKAGE
;; Only in code that is loaded
'foo:in-package                                    ; 4* any use of symbol FOO::IN-PACKAGE
Note that on entrance to the file, FOO does have a symbol named IN-PACKAGE, but it actually belongs to CL. Over the course of the file this symbol is replaced. Only after statement 2* does the system know that the IN-PACKAGE symbol has changed. So when form 4* is READ, FOO:IN-PACKAGE is interpreted as a symbol whose home package is FOO, not CL.
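One can check this at the REPL; a sketch, assuming a fresh package FOO that uses CL (SHADOW here stands in for the unintern-and-intern pair of steps 1* and 2*, since an inherited symbol is not directly present in the package):

```lisp
(defpackage "FOO" (:use "CL"))

;; Before the replacement, the name resolves to the inherited CL symbol:
(eq (find-symbol "IN-PACKAGE" "FOO") 'cl:in-package)   ; => T

;; Steps 1*-3*: replace the inherited symbol and export the new one
(shadow "IN-PACKAGE" "FOO")
(export (find-symbol "IN-PACKAGE" "FOO") "FOO")

;; Afterwards READ resolves the same name to a fresh symbol homed in FOO:
(eq (find-symbol "IN-PACKAGE" "FOO") 'cl:in-package)   ; => NIL
(package-name (symbol-package (find-symbol "IN-PACKAGE" "FOO"))) ; => "FOO"
```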
So all this works because we have a side effect that affects the reader, and this side effect does what is expected when the source file is loaded. Hence the analogy with changing the readtable in the middle of a source file.
The question is, why should that side effect also take place in the compiled file?
A compiled file is specified to be a set of pre-parsed forms that contain constants plus some internal representation (native or not) of the toplevel forms to be executed. Since they are pre-parsed, the compiler is free to have stored all constants in some efficient representation and to reconstruct them according to the rules of similarity.
The problem we have with the code above is that it will work differently depending on whether the constants are reconstructed at the beginning of the file, when FOO:IN-PACKAGE is similar to CL:IN-PACKAGE, or as forms are being executed, in which case FOO:IN-PACKAGE is a different symbol.
There is, in my understanding, nothing that forces an implementation to choose one order or another when reconstructing literal objects, and code that relies on a particular order is non-conformant.
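Schematically, the two permissible orders look like this (hypothetical loaders, not the behavior of any particular implementation):

```lisp
;; The compiled file conceptually contains:
;;   literal L    = the token FOO:IN-PACKAGE from the source text
;;   forms F1..F3 = the package surgery of steps 1*-3*
;;   form  F4     = code that uses L
;;
;; Loader A: reconstructs every literal first, then executes F1..F4.
;;   L is interned while CL:IN-PACKAGE is still accessible in FOO,
;;   so L ends up being CL:IN-PACKAGE and F4 misbehaves.
;;
;; Loader B: reconstructs each literal just before the form that uses it.
;;   F1..F3 have already run when L is interned, so L is the new
;;   FOO:IN-PACKAGE and the file behaves like the interpreted source.
```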
There are many other examples of non-conformant code, and I suspect that some of it is found in the library mentioned above. For example, assume that we have a source file which consists of a single line:
(defvar *foo* 2)
There is no IN-PACKAGE form. Assume that when the code is compiled the value of *PACKAGE* is the package "FOO". ECL will intern the symbol as FOO::*FOO*. Now we have a potential conflict:
1) The user sets *package* to "CL-USER" and loads the compiled file
2) The user keeps *package* as "FOO" and loads the compiled file
In both cases ECL will evaluate (defvar FOO::*FOO* 2), while other implementations may evaluate (defvar CL-USER::*FOO* 2) or (defvar FOO::*FOO* 2) in each case.
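The difference is visible already at read time; a REPL sketch (package FOO is hypothetical, it just needs to use CL):

```lisp
(defpackage "FOO" (:use "CL"))

;; The same source text yields different symbols depending on *PACKAGE*:
(let ((*package* (find-package "FOO")))
  (read-from-string "(defvar *foo* 2)"))   ; => (DEFVAR FOO::*FOO* 2)

(let ((*package* (find-package "CL-USER")))
  (read-from-string "(defvar *foo* 2)"))   ; => (DEFVAR COMMON-LISP-USER::*FOO* 2)

;; A file compiler interns symbols at compile time and freezes one of
;; these choices into the fasl; loading the source makes the choice afresh.
```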
Those who assume that compiled code should behave exactly like loaded source code will find that the specification explicitly states otherwise: the user may not expect 1) and 2) to behave the same for source files and compiled files.