Standard Lisp on Common Lisp, old code available?

2013-02-26
2013-02-27
  • Rainer Joswig
    2013-02-26

    Hi,

    I've read that Eric Benson wrote support for Standard Lisp on top of Common Lisp. Years ago. It was supposed to run on top of KCL and Ibuki CL.

    I saw that mentioned in a very old German Lisp book, which I bought recently. The authors had a chapter about how to embed Standard Lisp in Common Lisp and referred to that code. They offered the code on a floppy disk - which I don't have.

    Does anybody know what happened to that code? I read that Ibuki offered Reduce as a product using their CL implementations many years ago.

    Reading that old Book (LISP, Fallstudien mit Anwendungen in der Künstlichen Intelligenz by Rüdiger Esser and Elisabeth Feldmar), I was tempted to experiment a bit. I took some of their code and some of Arthur Norman's VSL, wrote some of the missing bits and now have something like 60%-80% of what might be necessary to run raw Reduce. Some of the issues are more system building issues and how to integrate that into CL.

    I would be interested to look at an existing port, to see how some things were done. That might help me to fill in some missing bits.

    Regards,

    Rainer Joswig

     
    • Arthur Norman
      2013-02-26

      On Tue, 26 Feb 2013, Rainer Joswig wrote:

      Hi,
      I've read that Eric Benson wrote support for Standard Lisp on top of Common
      Lisp. Years ago. It was supposed to run on top of KCL and Ibuki CL.
      The historical archives show that in 1993, in Reduce 3.6, there was a file
      src/clrend.red, and with 3.4 and 3.5 there is slightly more evidence of
      somebody having tried that; but certainly back in the early 1990s the
      large amount of extra capability that Common Lisp provided that was of no
      special value to Reduce was a resource embarrassment. I believe that it has
      not been looked at much since then.

      I saw that mentioned in a very old German Lisp book, which I bought
      recently. The authors had a chapter about how to embed Standard Lisp in
      Common Lisp and referred to that code. They offered the code on a floppy
      disk - which I don't have.

      Does anybody know what happened to that code? I read that Ibuki offered
      Reduce as a product using their CL implementations many years ago.

      I would expect it to be easy to go through the list of Standard Lisp
      functions and define a version of each in terms of Common Lisp. By using
      the package system, the risk of name-clashes between different behaviours
      for the same name would be easy to avoid... so if I felt the need to do
      that any time I would expect to just sit down and do it rather than need to
      find an old floppy disk! Getting most of Reduce working would mainly be a
      matter of hacking build scripts, I rather suspect.
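
      Just to sketch what I mean (untested, and the exact list of names to
      shadow would have to come from working through the Standard Lisp report
      rather than from me guessing here):

      (defpackage "STANDARD-LISP"
        (:nicknames "SL")
        (:use "COMMON-LISP")
        ;; shadow just the names whose Standard Lisp behaviour differs from CL
        (:shadow "REVERSE" "APPLY" "MAP"))

      (in-package "STANDARD-LISP")

      ;; e.g. Standard Lisp REVERSE returns NIL for a non-list argument,
      ;; where CL:REVERSE would signal an error
      (defun reverse (x)
        (if (listp x) (cl:reverse x) nil))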

      Reading that old Book (LISP, Fallstudien mit Anwendungen in der Künstlichen
      Intelligenz by Rüdiger Esser and Elisabeth Feldmar), I was tempted to
      experiment a bit. I took some of their code and some of Arthur Norman's VSL,
      wrote some of the missing bits and now have something like 60%-80% of what
      might be necessary to run raw Reduce. Some of the issues are more system
      building issues and how to integrate that into CL.

      Yes. Over the years PSL and CSL have mutually agreed clarifications or
      extensions to what was documented in the Standard Lisp Report, and the PSL
      vs CSL incompatibilities in Reduce have to a large extent been sorted out.
      Also each system has its own set of functions NOT blessed as part of
      Standard Lisp but used in its system building or to provide private extra
      features.

      I would be interested to look at an existing port, to see how some things
      were done. That might help me to fill in some missing bits.

      Regards,

      Rainer Joswig

      From a Reduce perspective it is great to have people interested in getting
      into system hacking. But with several Lisp variants already in play we may
      still want to ensure that compatibility and consistency have been VERY well
      tested before any ports to new and different Lisps are merged in as
      centrally supported, since they represent a potential growth in the work
      for compatibility testing of any new algebra code, of performance
      enhancements, etc. etc...

      I would however be interested to see even a partial subset of Reduce
      hacked to run on a modern Common Lisp to see what performance looks like
      these days. My perception from way back was that the Standard Lisp
      implementations that had been built pretty well explicitly to be tuned for
      Reduce had real advantages, but the world outside may have caught up.

      Arthur

       
      • Rainer Joswig
        2013-02-26

        Hi Arthur,

        thanks for your answer!

        There are some tricky language implementation issues with some tradeoffs. I don't know how much of Standard Lisp support is really needed, plus I might not be aware of semantic differences. A few points:

        • macros

        Common Lisp already provides macros and macro expansion. So, do we reuse this or implement macro expansion anew for Standard Lisp? Reuse means that expanded code can have Common Lisp internals exposed - internals which would be introduced via Common Lisp macros, for example symbols from other packages. I am currently using CL's macro definition and expansion.
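
        Roughly, the bridge I am using looks like this (only a sketch; in Standard Lisp a macro defined with DM receives the whole calling form as its single argument):

        ;; Sketch: map Standard Lisp's (dm name (u) body) onto CL's DEFMACRO.
        ;; The expansion machinery is then whatever the host CL builds, so GETD
        ;; on the name will show host internals rather than the raw body.
        (defmacro dm (name (u) &body body)
          (let ((rest (gensym)))
            `(defmacro ,name (&whole ,u &rest ,rest)
               (declare (ignore ,rest))
               ,@body)))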

        • FASLs

        SL/Reduce provides some form of FASLs. In this case it is preparsed and expanded Lisp code saved into a file. If this should work, I need to make sure that Common Lisp does not leak any data structures which can't be printed and read as s-expressions. It then also should NOT leak any Common Lisp internals - since SL might not be able to parse or print them. Additionally, are these FASLs really necessary? In Common Lisp I would compile a file to machine code and use that as a fasl, or parse the original again - there is little speed advantage nowadays in preparsed/macroexpanded code.

        • reader

        One can reuse the CL reader and write reader macros for vectors and strings. But is that enough? Are there other differences (like in reading numbers)? Does it matter? If RLISP reimplements the reader, it might be okay to just use a hacked CL reader and have Reduce then later read with its own RLISP and Reduce parser/reader.
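
        To illustrate the kind of hacked reader I mean (only a sketch, and certainly incomplete - vectors and strings would still need their own reader macros): Standard Lisp uses ! as the escape character in identifiers, and the case handling differs, both of which a copied CL readtable can be taught:

        ;; Sketch: a readtable for reading Standard Lisp source with the CL reader.
        (defvar *sl-readtable* (copy-readtable nil))

        ;; make ! a single-escape character, as in !*raise
        (set-syntax-from-char #\! #\\ *sl-readtable*)

        ;; keep symbol names as written; :downcase may be more appropriate,
        ;; depending on how the rest of the system interns symbols
        (setf (readtable-case *sl-readtable*) :preserve)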

        • putd / getd

        How much of that is actually needed? Does it need to record source code and return it? What should it tell about various functions? For example, LAMBDA is a macro in Common Lisp - SL had better not be aware of that.
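
        A rough cut of what I have in mind for PUTD/GETD (a sketch, with made-up details; it keeps the original definition on the property list so GETD can hand back plain SL code instead of host internals):

        ;; Sketch: PUTD installs a definition given as (putd name type lambda-expr),
        ;; GETD returns (type . definition) or NIL.
        (defun putd (name type body)
          (setf (get name 'sl-definition) (cons type body))
          (case type
            (expr  (setf (symbol-function name) (coerce body 'function)))
            (macro (setf (macro-function name)
                         ;; an SL macro takes the whole form; a CL macro function
                         ;; takes the form plus an environment
                         (let ((fn (coerce body 'function)))
                           (lambda (form env)
                             (declare (ignore env))
                             (funcall fn form))))))
          name)

        (defun getd (name)
          (or (get name 'sl-definition)
              (and (fboundp name) (cons 'expr (symbol-function name)))))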

        • small differences

        CL and SL have similar functions, but with slight differences. CAR, CDR and related functions all signal an error for NIL in SL - not in Common Lisp. SL:APPLY allows Lisp forms to be applied - Common Lisp's does not. SL:REVERSE reverses a list and returns NIL otherwise - CL:REVERSE signals an error when it sees a non-list. There are a lot more.
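
        For APPLY the workaround looks straightforward (a sketch, assuming "Lisp forms" here means lambda expressions; SL-APPLY is just a made-up name):

        ;; Sketch: an APPLY that also accepts a lambda expression as the function,
        ;; which Standard Lisp allows but CL:APPLY does not.
        (defun sl-apply (fn args)
          (cl:apply (if (and (consp fn) (eq (car fn) 'lambda))
                        (coerce fn 'function)
                        fn)
                    args))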

        I would however be interested to see even a partial subset of Reduce
        hacked to run on a modern Common Lisp to see what performance looks like
        these days. My perception from way back was that the Standard Lisp
        implementations that had been built pretty well explicitly to be tuned for
        Reduce had real advantages, but the world outside may have caught up.

        It depends on how much of CL and which CL gets used. Some CL implementations are more tuned to run compiled code, and calling the compiler at runtime might be costly/slow. So for such a CL implementation it would be necessary to reduce the amount of runtime compilation.

        If you are interested, I can send you my sources so far. Send me an email to joswig at lisp.de. I have used it with SBCL and LispWorks. I would also like to make that available for Clozure CL and CLISP.

        Regards,

        Rainer Joswig

         
        • Arthur Norman
          2013-02-26

          On Tue, 26 Feb 2013, Rainer Joswig wrote:

          Hi Arthur,

          thanks for your answer!

          There are some tricky language implementation issues with some
          tradeoffs. I don't know how much of Standard Lisp support is really
          needed plus I might not be aware of semantics differences. A few points:

          A reasonable place to start is that Standard Lisp was developed as a
          dialect that Reduce would use, and that was otherwise tolerably minimal.
          Thus to a very plausible first approximation a behaviour or functionality
          is only present in Standard Lisp if it is needed for and used by Reduce.
          That will obviously not be a 100% true statement if you are looking at a
          full implementation of Reduce.

          I can not guarantee to speak for the PSL camp, but as for CSL the view I
          took about it was that, at the time I was developing it, direct use of Lisp
          was fading fast. Parts of the AI camp had defected to Prolog (or worse),
          Common Lisp on the machines of the day was infeasibly large and slow
          unless you were somewhere astonishingly well funded, and the specialist
          Lisp machines from Symbolics etc were not looking like good long-term bets. So
          my perspective was that Standard Lisp and in particular CSL were an
          intermediate abstraction level used by Reduce to provide arithmetic,
          storage management and I/O that "happened" to have a list-based internal
          representation for code that a Lisp 1.5 enthusiast would recognise. But
          note that all Reduce is written in rlisp not Lisp so the parenthesised
          syntax only shows up when you are in a few hundred lines of bootstrapping.
          So e.g. in anything I do in the Reduce world things like CLOS or the
          amazing LOOP macro or Common Lisp formatted printing or Common Lisp
          complex numbers, declarations, great emphasis on precise treatment of
          lexical scope and so on are all irrelevancies since most of them do not
          get exposed readily through rlisp (and in some cases such as the rlisp
          "for each" statement parts of the functionality happen at the parser
          level). And the richer range of Common Lisp types can not be properly used
          in Reduce code that anybody will ever wish to use with the existing
          Lisps...

          • macros

          Common Lisp already provides macros and macro expansion. So, do we
          reuse this or implement macro expansion new for Standard Lisp? Reuse
          means that expanded code can have Common Lisp internals exposed.
          Internals which would be introduced via Common Lisp macros - for example
          symbols from other packages. I have been currently using CL's macro
          definition and expansion.

          So if you just treat Standard Lisp (dm foo (u) XXX) as Common Lisp
          (defmacro foo (u) XXX) then Common Lisp might wrap all sorts of stuff
          around XXX in the definition, so anybody who does (getd 'foo) will find
          a whole pile of Common Lisp mess there. Gosh - I tried it using "clisp",
          but I bet there is no guarantee as to just what other Common Lisps
          would do...

          [1]> (defmacro foo (u) XXX)
          FOO
          [2]> (symbol-function 'foo)

          #<MACRO
            #<FUNCTION FOO (SYSTEM::<MACRO-FORM> SYSTEM::<ENV-ARG>)
              (DECLARE (CONS SYSTEM::<MACRO-FORM>)) (DECLARE (IGNORE SYSTEM::<ENV-ARG>))
              (IF (NOT (SYSTEM::LIST-LENGTH-IN-BOUNDS-P SYSTEM::<MACRO-FORM> 2 2 NIL))
                (SYSTEM::MACRO-CALL-ERROR SYSTEM::<MACRO-FORM>)
                (LET* ((U (CADR SYSTEM::<MACRO-FORM>))) (BLOCK FOO XXX)))>
            (U)>

          Well if present in a Reduce that was running on top of a Common Lisp then
          all the Common Lisp functions and variables and declarations would be
          available. But any bits of Reduce that retrieve the definition and try to
          analyse or process the code could get confused - eg by extra special
          forms... so eg the cross-referencer could get hurt, and rlisp/form.red might
          need careful review.

          So rlisp (mainly in form.red) applies Lisp macros at rlisp parse/analyse
          time so I HOPE that after that has been got through all macros have been
          expanded, and I hope that the effect of the expansion only introduces the
          stuff the user specified in the macro and nothing beyond that!

          • FASLs

          The SL/Reduce provides some form of FASLs. In this case it is preparsed
          and expanded Lisp code saved into a file. If this should work, I need to
          make sure that Common Lisp does not leak any datastructures which can't
          be printed and read as s-expressions. It then also should NOT leak any
          Common Lisp internals - since SL might not be able to parse it or print
          it. Additionally, are these FASLs really necessary? In Common Lisp I
          would compile a file to machine code and use that as a fasl or parse the
          original again - there is little speed advantage nowadays in
          preparsed/macroexpanded code.

          Try it and see! But note that loading files created using
          "faslout/faslend" will only happen on the same Lisp that created the
          files, so it is not SL that is reading stuff back - it is the Lisp you are
          using. The files need to be able to contain basically anything that could
          go in a Reduce source file, so much of it will be function definitions and
          indeed those end up in the fasl-file as some sort of compiled code. But
          there can be other Lisp stuff that is to be executed at load time to
          initialise variables etc etc etc. In my BABY Lisp vsl I just dump raw Lisp
          but CSL and PSL have distinctly more elaborate formats and compilation of
          the functions into either bytecodes or native code.

          As for speed consequences, I find it takes a couple of minutes to parse
          and compile all the Reduce source files on my computer, but only a matter
          of seconds to load the pre-digested fasl files.

          There is a separate issue from faslout/faslend in that Reduce typically
          expects there to be a way to build a working full Reduce and then
          checkpoint it, so that when the system is (re-)started all the functions
          that had been established are present and you start in the Reduce
          read-eval-print loop, not any Lisp one. Such things used to be present in
          typical Common Lisp implementations but are not in the guaranteed standard...
          So here you need to decide if you are interested in building Reduce on
          some particular Common Lisp implementation (using some of its extensions)
          or whether you want to see how far you can get if you are careful to
          restrict yourself to a subset of the full standard that just includes
          features believed to be implemented reliably on all the major versions...
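
          For example (a sketch, and not something I have tried with Reduce) the
          checkpoint step on SBCL would use its save-lisp-and-die extension;
          LispWorks and Clozure CL have their own spellings of the same idea, and
          none of it is in the standard:

          ;; Sketch: after building everything, dump an image that restarts in the
          ;; Reduce read-eval-print loop. START-REDUCE is a made-up entry function.
          (sb-ext:save-lisp-and-die "reduce.core"
                                    :toplevel (lambda () (start-reduce))
                                    :executable t)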

          • reader

          One can reuse the CL reader and write reader macros for vectors and
          strings. But is that enough? Are there other differences (like in reading
          numbers?) Does it matter? If RLISP reimplements the reader, it might be
          okay to be able to just use a hacked CL reader and Reduce then later
          reads with its own RLISP and Reduce parser/reader.

          Mostly Reduce only uses the Lisp reader as such in a few hundred lines
          where a bootstrap implementation of Rlisp is read in. And that is liable
          to be pretty minimal and rather conservative. But when I scan the Reduce
          sources I find a few files in fide, misc, orthovec, rd, redfront that call
          the "READ" function and I would hate to speculate on whether there would
          be pain if they did not do what Standard Lisp expects. If I was concerned
          with a technology demonstrator, proof of concept, experimental bit of fun
          I would just ignore them to start with. If I wanted something that did all
          that Reduce does I would need to worry more - and that is the sort of
          thing that makes me happy that this is not something I am doing.

          • putd / getd

          How much of that is actually needed. Does it need to record source code
          and return it? What should it tell about various functions? For example
          LAMBDA is in Common Lisp a macro - SL better not be aware of that.

          Hahahah. My PERSONAL view is that many of the uses of getd and putd in the
          Reduce sources are a bad thing and that a version that did not rely on
          dynamic redefinition would be nicer, but the code is as it is. I count 202
          uses of getd and 39 of putd. But probably form.red and rcref.red (and
          possibly others) may notice when they see a macro and try to expand it. I
          have no idea whether this will matter. For a technology-demo just view
          rcref as "not the top priority" and get other things working!

          • small differences

          CL and SL have similar functions, but with slight differences. CAR, CDR
          and related all signal an error for NIL. Not in Common Lisp. SL:APPLY
          allows Lisp forms to be applied - not Common Lisp. SL:REVERSE reverses a
          list and returns NIL otherwise - CL:REVERSE signals an error when it sees
          a non-list. There are a lot more.

          Yes there are. Allowing (car nil) to return nil not an error ought not to
          hurt correct code, in that I HOPE no part of Reduce is coded to rely on
          that being an exception that could be trapped! MAP and friends take
          arguments in a different order. Standard Lisp has no character objects and
          uses symbols whose names are one character. SL (these days) thinks that
          symbols like car, cdr etc are spelt in lower case not old fashioned upper
          case. So yes, CL is the Lisp that generations of MIT (and their CMU
          offshoot) people worked on with an enthusiasm to produce a "big and
          powerful language" that would be used directly as Lisp. SL started as a
          subset of capabilities common to all tolerable Lisps and to my mind is
          both more firmly rooted in the old fashioned historical reading of Lisp
          (eg PLUS and TIMES rather than + and *) and perhaps in some ways it looks
          to the West Coast and certainly internationally rather than viewing MIT as
          the sole source of authority. But because it is a bit minimalistic
          Standard Lisp only specifies say 600 functions - a lot fewer than CL, so
          getting them all working is not too much work.

          Note that CSL starts off with an object list with only around 800
          nontrivial things on it. But vsl starts off with only 200 items on its
          object list but is nevertheless enough to get enough of Reduce built that
          one can demonstrate it doing many of the standard test scripts. So if vsl
          is 3000 lines of C implementing a rather small Lisp, plus only a couple of
          thousand lines of Lisp code to build it up with the functionality needed
          to build and run (much of) Reduce (rather slowly), I rather bet that
          bodging something on top of an existing Common Lisp so as to demonstrate
          something would only be a tolerably modest bit of work, even if finishing
          off everything would be much harder.

          So if I wanted to do this I would probably start by looking at either
          psl/dist/util/*.sl or csl/cslbase/rlisp.lsp and just incrementally try to
          build a world in which one of those could be loaded and run to provide a
          subset/bootstrap rlisp parser. I believe that in general if you got that
          far (it is around 3K lines of Lisp) then the rest from there on would feel
          a lot smoother.

                  Arthur
          
           
          • Rainer Joswig
            2013-02-27

            A reasonable place to start is that Standard Lisp was developed as a
            dialect that Reduce would use, and that was otherwise tolerably minimal.

            Yes, makes sense. I did that, mostly.

            But note that all Reduce is written in rlisp not Lisp

            Yes, I saw that. I find the layering of languages 'interesting', and was hoping that by writing a port for CL I would be motivated to learn a bit about RLISP (or even RLISP 88) and then the Reduce language.

            so anybody who does (getd 'foo) will find a whole pile of Common Lisp mess there.

            Yes, all defining macros of CL implementations (defun, defmacro, ...) will expand into code that likely does a lot of implementation-specific things, like recording sources, recording source locations, noting declarations, noting the code in the compile-time environment and so on.

            In my BABY Lisp vsl I just dump raw Lisp but CSL and PSL have distinctly more elaborate formats and compilation of the functions into either bytecodes or native code.

            Ah, okay. In Common Lisp I would use the function COMPILE-FILE to create such a fasl. But I fear it is not easily compatible with the way fasl creation is done in Standard Lisp. In Standard Lisp the interface looks like compiling from a stream of expressions. In Common Lisp there is no compile-from-stream, IIRC. One way around it might be to record all the expressions between fasl creation start and end, write those to a file and then call COMPILE-FILE to create the FASL.
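
            Roughly what I have in mind (a sketch with made-up names; the SL top level would have to hand each form to FASL-RECORD while a module is open):

            ;; Sketch: record the forms seen between faslout and faslend, write them
            ;; to a temporary source file and let COMPILE-FILE produce the host fasl.
            (defvar *fasl-forms* nil)
            (defvar *fasl-name* nil)

            (defun faslout (name)            ; name is a string, e.g. "alg"
              (setf *fasl-name* name
                    *fasl-forms* '()))

            (defun fasl-record (form)        ; called by the SL top level per form
              (when *fasl-name*
                (push form *fasl-forms*)))

            (defun faslend ()
              (let ((src (make-pathname :name *fasl-name* :type "lsp")))
                (with-open-file (out src :direction :output :if-exists :supersede)
                  (dolist (form (reverse *fasl-forms*))
                    (print form out)))
                (compile-file src)
                (setf *fasl-name* nil *fasl-forms* nil)))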

            But vsl starts off with only 200 items on its
            object list but is nevertheless enough to get enough of Reduce built that
            one can demonstrate it doing many of the standard test scripts.

            That's quite cool. I think it's great that you created the minimal 'Standard Lisp' for Reduce - it serves as a great example to learn from.

            So if I wanted to do this I would probably start by looking at either
            psl/dist/util/*.sl or csl/cslbase/rlisp.lsp and just incrementally try to
            build a world in which one of those could be loaded and run to provide a
            subset/bootstrap rlisp parser. I believe that in general if you got that
            far (it is around 3K lines of Lisp) then the rest from there on would feel
            a lot smoother.

            I'll look at that. Currently I have taken the VSL Lisp code and moved some into CL and removed some (like the bignum code). It then executes the buildreduce.lsp (with some changes) up until it starts creating the xremake fasl. So the rlisp code, smacros, poly, alg, etc. is already being loaded.

            So the next task when I have time is to see how to deal with the FASL generation.

            Regards,

            Rainer Joswig