Re: [Xweb-developers] Reactivating development
From: Peter B. <pe...@pe...> - 2003-11-10 02:42:09
murphee (Werner Schuster) wrote:

> Peter Becker wrote:
>> murphee wrote:
>>> http://www.angelfire.com/co/werners/developerzone/webbuilder.html
>>
>> Yes, that was one of the more advanced tries. And there was the
>> Mozilla plugin and my own attempt.
>> Your bus described in the presentation -- is that a Bean InfoBus?
>
> Nope, but it was influenced by the InfoBus (and jEdit's EditBus);
> Webbuilder was designed to be very extensible, and something like
> that was necessary for that;

What was the reason not to go with an InfoBus? I am currently trying to
understand the Bean world, and I wonder if I should use some of the
ideas in XWeb itself.

>>> the Xweb tree and also manipulate the build process (which is
>>> realized with Ant);
>>
>> What exactly does Ant do in this? The deployment/upload bit is
>> obvious, but I am not sure what else Ant could do. But that might be
>> my lack of imagination :-)
>
> Well, basically Xweb is thought of as a compiler that compiles
> websites, and Ant is used for the build process that does everything
> else; preprocessing could involve... e.g. updating the input files
> before generating the websites, like updating the input files from a
> CVS (the build file could be run by some cron job or be triggered by
> CVS commits, ...);
> Another idea (useful for open-source projects) would be generating
> Javadoc (or some other doc) and copying it into the finished website
> (for example, at the JOGI project (referenced in my sig) I put the
> Javadoc for the API online, but I always have to generate it
> and copy it to the output dir manually);
> Generally updating input files, like regenerating them from DB
> sources, ...

I don't see that as compelling; a little UNIX script would do the same.
Of course being able to call XWeb from Ant is nice for creating
cross-platform solutions, but I don't see the need for further
integration. The Ant task as you described it -- just calling the main
process from within Ant -- seems quite appropriate. Maybe some extra
options like configuring the baseURL or some parameters might be
helpful.

> Oh yeah... running JTidy before Xweb would be useful if you want
> to import non-XHTML input files (this would make XWeb more
> user-friendly, as the user does not have to bother with converting
> their HTML files to valid XHTML);

I definitely want JTidy support within XWeb; it seems too important to
require extra tools. The XHTML requirement for things like the generic
stylesheet coming with the distribution is a bit of a drawback.

> etc. etc. ...
>
>>> Webbuilder is only a prototype, but if there is interest in the
>>> thing, I could put it on Sourceforge and do some work on it again;
>>
>> I'd rather like to see it as a part of XWeb -- the master plan has
>> always been to replace NetObjects Fusion :-)
>
> hmm... NetObjects Fusion... is that still on the market?

Google finds it: http://www.netobjects.com/ -- if only they knew how to
do a proper website :-) Version 7.5 is a bit higher than the one I used
years ago (3.0).

> Haven't heard much about it lately; let's aim for... er... the
> Macromedia thing now... Dreamweaver or something...

Yeah, Dreamweaver seems to be the one used at the moment. Never tried
it myself.

>> And I don't see enough resources
>> to run multiple frontends, but who knows -- sometimes people pop up
>> all of a sudden. I'm easy -- if you want your own project, that's
>> fine, too. It just seems better for publicity purposes to have just
>> one.
>
> Hmm...
> well, let's see what happens to the other Frontend project;
>
> I don't really have much time (make that: no time) for updating
> Webbuilder (although I've been planning to); I could set up the
> project on Sourceforge and give you (and possibly others) CVS access
> to it... or something like that; anything's better than leaving
> the code to rot (a lot of work in there);

I'm happy to put it into the XWeb CVS. I think that gives the code a
better chance of being found than yet another SF project without any
activity. But it's your choice, of course. And there is the licensing
issue -- at the moment XWeb is still public domain. The thing with
licenses is that I don't really want to care, but sometimes I feel I
should :-)

>> Input: two absolute paths, one the current position, one the target
>> 1) strip both target and source of the common start
>> 2) add ".." + File.separator n times in front of the target, where n
>> is the number of File.separators in the source
>
> that should work, I guess;
>
>> Similarly, and more safely, you could do that with File.getParent():
>> dump all parents on a stack, find the deepest common parent, then
>> add ".."s when going down the source side and afterwards the
>> directory names when going down the target side.
>
> well, I tried to fix the bug some time ago... I think I changed the
> code that generates the structure tree (the one copied into each XML
> file); instead of adding the base URL to each link, I prepended "../"
> to the string for each new level (i.e. the code works recursively,
> and I added the "../" each time the code went down another recursion
> level (= section)); that should work, but it didn't really... hell
> knows why; after some hours I got distracted and... well, I guess the
> code still lies hidden somewhere;

Part of the problem is that you don't really know what is in the
@targetDir attributes -- it can be no directory change or multiple
ones. The files themselves allow changing directories, too -- there is
nothing enforcing the target location to be in the current directory.
I often use this to hack the main page -- the overview page of the
first section is sometimes "../index.html" in my sites. I think there
were some other issues why relative URLs aren't that easy. But
comparing the actual File-s should be safe if done right.

Peter
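
P.S. Roughly what I have in mind for the relative links -- just an
untested sketch against plain java.io.File; class and method names are
placeholders, not actual XWeb code:

import java.io.File;
import java.util.LinkedList;

/**
 * Untested sketch of the idea above: strip the common start of two
 * absolute paths, then climb out of the remaining source directories
 * with "..".
 */
public class RelativeLinkSketch {

    /**
     * @param sourceDir absolute directory the current page is written to
     * @param target    absolute path of the file the link should point to
     * @return a relative URL (using '/', since it ends up in a link)
     */
    public static String relativize(File sourceDir, File target) {
        String[] src = split(sourceDir);
        String[] dst = split(target);

        // 1) strip both paths of the common start
        int common = 0;
        while (common < src.length && common < dst.length
                && src[common].equals(dst[common])) {
            common++;
        }

        StringBuffer result = new StringBuffer();
        // 2) one ".." for every source directory below the common part;
        //    '/' rather than File.separator since this goes into a link
        for (int i = common; i < src.length; i++) {
            result.append("../");
        }
        // then descend along the target side
        for (int i = common; i < dst.length; i++) {
            result.append(dst[i]);
            if (i < dst.length - 1) {
                result.append('/');
            }
        }
        return result.toString();
    }

    // dump all parents on a stack (deepest last), so we compare real
    // path components instead of raw strings
    private static String[] split(File file) {
        LinkedList parts = new LinkedList();
        File current = file.getAbsoluteFile();
        while (current != null) {
            parts.addFirst(current.getName());
            current = current.getParentFile();
        }
        return (String[]) parts.toArray(new String[parts.size()]);
    }
}

Since it compares File parents rather than raw strings, the @targetDir
hacks that jump out of the current directory should fall out of this
for free -- as long as the source directory passed in is the real
output location of the page.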