From: Robert D. <rob...@gm...> - 2022-10-18 17:22:32
On Tue, Oct 18, 2022 at 9:08 AM Leo Butler <Leo...@um...> wrote:

> Developers who don't want to spend time working around gcl's
> deficiencies shouldn't. I believe that several have already stated that
> they do not test their work on gcl builds.
>
> OTOH, I would like to have a minimally functional maxima+gcl. If that
> means we add a cautionary bannerline about unsupported features, let's
> do it.

No, we should not do that. "Supported" has to mean that you get a working system. If various features don't work, we'll get bug reports or other complaints from users about it. At that point, telling people they have to back up and try again with a different Lisp implementation, or just live with the defects, are both unsatisfying resolutions.

In addition, if Maxima can be built with GCL (producing a sort-of-working system), there is a pretty good chance that someone packaging Maxima for distribution is going to do exactly that. The way to prevent that is to stop supporting it altogether.

There are all kinds of #+gcl / #-gcl conditionals scattered all over the code, and frankly it's just a lot of dross. We can't get rid of it unless we agree that we're not supporting GCL at all.

> The problem, as I have experienced repeatedly, is that gcl
> will finish big computations while others barf (sbcl), take toooo
> long (clisp) or some combination thereof.

Hmm. My also-anecdotal experience is the opposite: GCL runs out of memory in cases where other Lisps don't. I would ask for examples, but to be honest this is secondary to the other issues about features lacking in GCL.

FWIW

Robert
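P.S. For list readers who don't work in the Lisp sources: #+gcl and #-gcl are standard Common Lisp read-time feature conditionals. The reader keeps the form that follows #+gcl only when :gcl is present in *features* (i.e. when building under GCL), and keeps the form after #-gcl on every other implementation. A minimal illustrative sketch (the variable name and values here are hypothetical, not taken from the Maxima sources):

```lisp
;; Read-time conditionals: the reader drops or keeps the next form
;; depending on whether :gcl appears in *features*.
;; Hypothetical example -- not code from Maxima.
(defvar *some-limit*
  #+gcl 16    ; this form is read only when building under GCL
  #-gcl 17)   ; this form is read under any other Lisp
```

Multiply that pattern across the whole tree and you get the dross I'm complaining about: every such pair is a GCL-specific branch that has to be maintained, and none of it can be deleted while GCL is nominally supported.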