| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2000 | | | | | 1 | 103 | 105 | 16 | 16 | 78 | 36 | 58 |
| 2001 | 100 | 155 | 84 | 33 | 22 | 77 | 36 | 37 | 183 | 74 | 235 | 165 |
| 2002 | 187 | 183 | 52 | 10 | 15 | 19 | 43 | 90 | 144 | 144 | 171 | 78 |
| 2003 | 113 | 99 | 80 | 44 | 35 | 32 | 34 | 34 | 30 | 57 | 97 | 139 |
| 2004 | 132 | 223 | 300 | 221 | 171 | 286 | 188 | 107 | 97 | 106 | 139 | 125 |
| 2005 | 200 | 116 | 68 | 158 | 70 | 80 | 55 | 52 | 92 | 141 | 86 | 41 |
| 2006 | 35 | 62 | 59 | 52 | 51 | 61 | 30 | 36 | 12 | 4 | 22 | 34 |
| 2007 | 49 | 19 | 37 | 16 | 9 | 38 | 17 | 31 | 16 | 34 | 4 | 8 |
| 2008 | 8 | 16 | 14 | 6 | 4 | 5 | 9 | 36 | 6 | 3 | 3 | 3 |
| 2009 | 14 | 2 | 7 | 16 | 2 | 10 | 1 | 10 | 11 | 4 | 2 | |
| 2010 | 1 | | 13 | 11 | 18 | 44 | 7 | 2 | 14 | | 6 | |
| 2011 | 2 | 6 | 3 | 2 | | | | | | | | |
| 2012 | 11 | 3 | 11 | | | | | | | 1 | 4 | |
| 2013 | | | | 3 | | | | | | | | |
| 2014 | | | | | 4 | | | | | | 8 | 1 |
| 2015 | 3 | 2 | | 3 | 1 | | 1 | | | | | 2 |
| 2016 | | 4 | | | | | | | | | | |
| 2017 | | | | | | | 3 | | | | | |
| 2018 | | | | | 3 | 1 | | | | | | |
| 2020 | | | | | 3 | | 5 | | | | | |
| 2021 | | 4 | | | | | 1 | 6 | 3 | | | |
| 2022 | 11 | 2 | | | | | | | | | | |
| 2023 | | | | | | | | | | 1 | 3 | 3 |
| 2024 | 7 | 2 | 1 | | | | | | | | | |
| 2025 | | | | 1 | 1 | | 3 | | 5 | | | |
From: Adam S. <la...@sp...> - 2001-06-13 02:59:00
|
> Aye, I'm not a great fan of SocialWikis, mostly because I don't think
> they work very well for that purpose (I'm an email guy). For
> collaborative documentation and development of a KnowledgeBase, Wikis
> work exceptionally well. What I'm trying to do here is to marry a
> Wiki, mailing list archives and a K5-ish weblog. Early screenshot:

GOD YES! i'm an email guy as well and have been thinking about this for a long time. i'm not much of a coder, but i've been a unix admin since '93 so i'm fairly clued. :) if you want another set of eyes and someone to install it and give feedback, i'd *LOVE* to help with this. i assume you're writing it in php? are you using a database back end?

> Oh yeah, the comments are individual WikiItems as well (I've got
> HierarchicalWikiItems a la WikiItem/SubItem/SubItem, with individual
> items able to appear in multiple places in the tree (abstract views),
> as well as the ability for an Item to embed/nest the contents of
> another WikiItem inside itself at display time (think quotes)).

this is almost exactly what i was thinking as well. the more i thought about it, the more i decided that all the important tech was already there and that it was largely a ui design issue: how do you merge these different forms of content management together for maximum results? i'd love to see what you've done.

adam.
|
From: Steve W. <sw...@pa...> - 2001-06-11 02:44:25
|
Thanks for the tour, Joel!

On Sun, 10 Jun 2001, Joel Uckelman wrote:

> So, here's what looking around has suggested to me:
>
> 1. Should changes be stored incrementally, and old pages built from them?
> Or should diffs be calculated from stored pages? The former conserves
> storage at the expense of CPU cycles, the latter the opposite. Or maybe
> incremental diffs and old pages should both be stored, period? This seems
> like a basic design decision, rather than something that would be
> user-configurable.

I have a core belief about Wikis: they do not scale. They do not scale in terms of the number of users, that is. I think a Wiki can hold about as many active users as a mailing list; once you go beyond a certain number of daily posters to a mailing list, it's no longer feasible to read that mailing list. There's too much information and too low a signal-to-noise ratio.

So conserving CPU cycles is a non-issue (relatively... we are not going to start using bubble sorts!). Storage space, on the other hand, will only get cheaper and more vast as time goes on. So we will store whole pages and calculate diffs as needed.

I think diffs are not needed much anyway. (I mean, there is low demand for them versus other features users use, like search. Diff is critically important to have, though.) My favorite diff implementation of all time is actually not in a Wiki but would serve as a good model for us to emulate:

http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/phpwiki/phpwiki/README.diff?r1=text&tr1=1.6&r2=text&tr2=1.8&diff_format=h

I like the side-by-side comparison and the color coding. I've found this incredibly helpful when doing development. You can also play with a tool called "sdiff" on a lot of Unix systems, which will give you a simple monochrome side-by-side diff.

> 2. Buttons/links associated with versioning (and everything else!) should
> have obvious functions, or at least ones that can be determined after a few
> tries. Duh. MoinMoin was a particularly egregious offender in this regard.
>
> 3. Diffs should be easy to read, but not so sparse as to omit useful
> information. I like the way diffs are done in phpwiki right now. Good job,
> guys.

All credit to Jeff! I doubt we'd even have diffs without his commitment and amazing hacking abilities.

> 5. Ability to access all revisions from a single page is a Good Thing.
> Having to click a zillion times to get to the one you want is not.

Again we could copy viewcvs.cgi:

http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/phpwiki/phpwiki/README

Rather than clutter the reading page with a million links to do things, I think we're better off just having the "diff" link go to a page like the one above.

> 6. The major/minor revision distinction would provide a good way to tag
> pages that were altered, e.g., to correct typos. It could also be used by
> someone malicious who wanted to draw attention away from revisions, though.
> This, like all things Wiki, would depend on the goodwill of the user.

With user auth, this problem would be rare, one would hope.

> 7. How exactly does the current version of phpwiki decide what to archive?
> I've noticed pages with archiving for the immediately previous version, a
> version more than one change out of date, and no previous versions at all.
> What's the deal?

It goes like this: if you edit the page and you are not the previous author, the old version is archived. You can continue to edit the page forever after and it will not archive another version (as long as you come from the same IP address, which is how we track the author). This has turned out to be a problem for some people who are the sole users of the Wiki. This algorithm was taken right from the WikiWikiWeb itself.

I think we'll go with the MeatBall approach of the "do not archive, these are small changes" option. Better to archive minor changes than to lose information.

~swain

---
http://www.panix.com/~swain/
"Without music to decorate it, time is just a bunch of boring production deadlines or dates by which bills must be paid." -- Frank Zappa
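To make the answer to question 7 concrete, the rule Steve describes can be sketched in a few lines of PHP. This is only an illustration of the rule as stated above; the $store object and its methods are hypothetical, not PhpWiki's actual page-store API.

```php
<?php
// Sketch only: archive the old copy when the new editor differs from the
// previous author (tracked by IP address), as described in the message above.
// $store is a hypothetical page-store object, not PhpWiki's real API.
function save_page($store, $pagename, $newtext, $editor_ip)
{
    $current = $store->get_current($pagename);   // array('text' => ..., 'author' => ...)

    if ($current !== false && $current['author'] !== $editor_ip) {
        // A different person is editing: keep the previous version.
        $store->save_archive($pagename, $current);
    }

    // The same author editing again simply replaces the current copy.
    $store->save_current($pagename, array(
        'text'   => $newtext,
        'author' => $editor_ip,
        'mtime'  => time(),
    ));
}
```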
|
From: J C L. <cl...@ka...> - 2001-06-10 23:46:42
|
On Sun, 10 Jun 2001 14:39:39 -0700 (PDT) Adam Shand <la...@sp...> wrote:

>> I tend to the view that forgiveness is a human construct and is
>> therefore for humans, not machines, or if you wish, the dialectic
>> opposite of SunirShah's parting comment.

> i mostly agree with you. what i think is interesting is the
> attempt to make a collaborative environment (in this case a wiki)
> match a human social environment. groups of humans usually
> forgive and forget over time (barring major transgressions), i
> think the idea of making a computer forgive and forget is
> potentially interesting.

It's interesting, but I think also an intellectual tar baby. As soon as you go down that road you are codifying personal human evaluations, which means that they are no longer personal but are now shared cultural expectations which are enforced on the individual beyond his real ability to counter. Aiiiie.

>> And yes, I'd be a fan of WayBackMode (added to list of things to
>> implement here along with per user PageTagging).

> this would certainly be preferable for more technical or work
> oriented wikis. i'm not sure it would be for community based
> wikis.

Aye, I'm not a great fan of SocialWikis, mostly because I don't think they work very well for that purpose (I'm an email guy). For collaborative documentation and development of a KnowledgeBase, Wikis work exceptionally well. What I'm trying to do here is to marry a Wiki, mailing list archives and a K5-ish weblog. Early screenshot:

http://www.kanga.nu/~claw/wikitest.png

Oh yeah, the comments are individual WikiItems as well (I've got HierarchicalWikiItems a la WikiItem/SubItem/SubItem, with individual items able to appear in multiple places in the tree (abstract views), as well as the ability for an Item to embed/nest the contents of another WikiItem inside itself at display time (think quotes)).

> however until someone implements it we'll never know :)

There is that.

--
J C Lawrence  cl...@ka...
---------(*)  http://www.kanga.nu/~claw/
The pressure to survive and rhetoric may make strange bedfellows
|
From: Adam S. <la...@sp...> - 2001-06-10 21:39:43
|
> I tend to the view that forgiveness is a human construct and is
> therefore for humans, not machines, or if you wish, the dialectic
> opposite of SunirShah's parting comment.

i mostly agree with you. what i think is interesting is the attempt to make a collaborative environment (in this case a wiki) match a human social environment. groups of humans usually forgive and forget over time (barring major transgressions), i think the idea of making a computer forgive and forget is potentially interesting.

> And yes, I'd be a fan of WayBackMode (added to list of things to
> implement here along with per user PageTagging).

this would certainly be preferable for more technical or work oriented wikis. i'm not sure it would be for community based wikis. however, until someone implements it we'll never know :)

adam.
|
From: J C L. <cl...@ka...> - 2001-06-10 18:00:52
|
On Sun, 10 Jun 2001 09:36:31 -0700 (PDT) Adam Shand <la...@sp...> wrote:

>> Given the plummeting cost/MB I'd be more interested in
>> implementing a hierarchical storage system where sufficiently old
>> and non-current versions of content are moved to a secondary
>> backing store, away from the primary copy, yet still accessible.

> there are some interesting thoughts on the subject here:
> http://www.usemod.com/cgi-bin/mb.pl?ForgiveAndForget

I tend to the view that forgiveness is a human construct and is therefore for humans, not machines, or if you wish, the dialectic opposite of SunirShah's parting comment. I prefer humans to discriminate based on the full set of available data, not a partial set formed by an arbitrary and unexaminable (due to being unrecorded) editing of the data set. The problem then is data mining and establishing and maintaining ValuedInterpretations -- the things humans are actually good at.

And yes, I'd be a fan of WayBackMode (added to list of things to implement here along with per user PageTagging).

>> Heck, 20Gig of RW SCSI II is less than $200 now, even less if
>> you're silly enough to go with IDE on a server.

> ack! a scsi bigot! :)

It works.

--
J C Lawrence  cl...@ka...
---------(*)  http://www.kanga.nu/~claw/
The pressure to survive and rhetoric may make strange bedfellows
|
From: Adam S. <la...@sp...> - 2001-06-10 16:36:32
|
> Given the plummeting cost/MB I'd be more interested in implementing a
> hierarchical storage system where sufficiently old and non-current
> versions of content are moved to a secondary backing store, away from
> the primary copy, yet still accessible.

there are some interesting thoughts on the subject here:

http://www.usemod.com/cgi-bin/mb.pl?ForgiveAndForget

> Heck, 20Gig of RW SCSI II is less than $200 now, even less if you're
> silly enough to go with IDE on a server.

ack! a scsi bigot! :)

adam.
|
From: Reini U. <ru...@x-...> - 2001-06-10 11:33:24
|
I've put this very good review onto http://phpwiki.sourceforge.net/phpwiki/index.php?WikiVersioning Good work! |
|
From: Joel U. <uck...@no...> - 2001-06-10 07:23:25
|
In an effort to see what versioning features are embodied in other Wikis, I went to the list of Wiki implementations (http://c2.com/cgi/wiki?WikiEngines) and took the grand tour. Here's what I found.

There aren't a lot of Wikis that have versioning; I made some notes about the ones that do. The notes became shorter as I went, mainly because I started seeing the same features repeated in the latter Wiki clones. Here goes:

TWiki
http://TWiki.org/cgi-bin/view/Main/WebHome
Diffs are hard to read, too cluttered. The version links at the bottoms of pages provide quick access to the four most recent page versions as well as diffs from one to the next. The ">.." link will allow you to retrieve any version, as well as show diffs between any two versions. These diffs are incremental, though, e.g., if you want to see the diff between 1.30 and 1.1, you get all of the diffs in between, not a nice single diff.

UseMod Wiki
http://www.usemod.com/cgi-bin/wiki.pl
"View Other Versions" link leads to a list of archived page versions, for which there are links to view full pages and diffs. Diffs are clean, easy to read. Diffs are always against the current version, which is obnoxious if what you want is a diff between two non-current pages; moreover, this makes diffs with older pages just about useless, because the diffs are so huge. A neat feature is the way major and minor revisions are handled. The edit page has a checkbox for designating an edit a minor revision, in which case it's noted as such in the revision list. Also, there's something called an "author diff", the function of which I have yet to determine.

Why Clublet
http://clublet.com/c/c/why?page=HomePage
I don't like their versioning system at all. There are VCR-like buttons for moving between versions, but apparently going from version 900 back to 500 would require 400 clicks. Also, the diff mechanism is too austere for my tastes---it shows you what's new, but not what's gone. The "gold bar" for marking new material isn't a bad idea, but then how do you mark something that has disappeared?

Web Macro
http://wiki.webmacro.org/WebMacro
Previous versions are accessed via a "View previous version" link at page bottom which decrements the page version by one. Same too-many-clicks problem as Clublet above. No diffs.

Wikki Tikki Tavi
http://tavi.sourceforge.net/WikkiTikkiTavi
"View document history" links to a version listing much like the UseMod one. Diff is attractive, easy to read for someone who doesn't know anything about diffs. Any two versions may be checked to see a diff between them. Does this mean the diffs are generated on the fly? (If not, the number of stored diffs would grow exponentially.) The "Compute Difference" button makes me think so, but I can't find it on their site.

MoinMoin
http://www.python.org/cgi-bin/moinmoin
Version links are tiny little icons in the upper right. It is not obvious what they do on first glance. The "Info" one contains a table of revisions. The diffs are all against the current version. Color coding is a good thing, I think.

PikiePikie
http://pikie.darktech.org/cgi/pikie?PikiePikie
Previous versions are accessible via numbered links at the bottom of the page. Probably not a good solution if there are lots of revisions. No diffs.

Swiki
http://pbl.cc.gatech.edu/myswiki/
Uses strikeouts for diffs, not overly easy to read. (Little icons are neat, though.) Previous versions are available in full, or as diffs with immediately prior and later versions. Button interface and large spacing is not overly attractive, and it is not obvious what buttons will do when viewing diffs.

WikiWorks
http://wiki.cs.uiuc.edu/VisualWorks
Diffs are poor. History displays as a table, with links to old versions.

END OF TOUR

So, here's what looking around has suggested to me:

1. Should changes be stored incrementally, and old pages built from them? Or should diffs be calculated from stored pages? The former conserves storage at the expense of CPU cycles, the latter the opposite. Or maybe incremental diffs and old pages should both be stored, period? This seems like a basic design decision, rather than something that would be user-configurable.

2. Buttons/links associated with versioning (and everything else!) should have obvious functions, or at least ones that can be determined after a few tries. Duh. MoinMoin was a particularly egregious offender in this regard.

3. Diffs should be easy to read, but not so sparse as to omit useful information. I like the way diffs are done in phpwiki right now. Good job, guys.

4. Computing diffs between arbitrary versions on the fly (as in Wikki Tikki Tavi) is neat, and wouldn't be that resource intensive, since how often would anyone actually use that anyway? Of course, that may also be an argument against implementing it in the first place.

5. Ability to access all revisions from a single page is a Good Thing. Having to click a zillion times to get to the one you want is not.

6. The major/minor revision distinction would provide a good way to tag pages that were altered, e.g., to correct typos. It could also be used by someone malicious who wanted to draw attention away from revisions, though. This, like all things Wiki, would depend on the goodwill of the user.

7. How exactly does the current version of phpwiki decide what to archive? I've noticed pages with archiving for the immediately previous version, a version more than one change out of date, and no previous versions at all. What's the deal?

--
J.
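On points 1 and 4 above: if whole pages are stored, a diff between any two versions can be derived on demand. The sketch below is a deliberately naive line-based LCS diff in PHP, just to show the shape of the computation; it is not PhpWiki's actual diff code and ignores all the presentation questions raised in the tour.

```php
<?php
// Illustrative only: a naive line-based diff computed on demand from two
// stored full-page texts. '-' marks lines removed, '+' marks lines added.
function naive_diff($old_text, $new_text)
{
    $old = explode("\n", $old_text);
    $new = explode("\n", $new_text);
    $n = count($old);
    $m = count($new);

    // Longest-common-subsequence lengths, filled from the bottom right.
    $lcs = array();
    for ($i = $n; $i >= 0; $i--) {
        for ($j = $m; $j >= 0; $j--) {
            if ($i == $n || $j == $m) {
                $lcs[$i][$j] = 0;
            } elseif ($old[$i] === $new[$j]) {
                $lcs[$i][$j] = $lcs[$i + 1][$j + 1] + 1;
            } else {
                $lcs[$i][$j] = max($lcs[$i + 1][$j], $lcs[$i][$j + 1]);
            }
        }
    }

    // Walk the table, emitting removed and added lines in order.
    $out = array();
    $i = 0;
    $j = 0;
    while ($i < $n && $j < $m) {
        if ($old[$i] === $new[$j]) {
            $out[] = '  ' . $old[$i]; $i++; $j++;
        } elseif ($lcs[$i + 1][$j] >= $lcs[$i][$j + 1]) {
            $out[] = '- ' . $old[$i]; $i++;
        } else {
            $out[] = '+ ' . $new[$j]; $j++;
        }
    }
    while ($i < $n) { $out[] = '- ' . $old[$i++]; }
    while ($j < $m) { $out[] = '+ ' . $new[$j++]; }
    return implode("\n", $out);
}
```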
|
From: J C L. <cl...@ka...> - 2001-06-10 07:17:41
|
On Sat, 9 Jun 2001 20:48:24 -0700 (PDT) Adam Shand <la...@sp...> wrote:

> the archiving scheme i'd like to see used is the one described
> here:

Given the plummeting cost/MB I'd be more interested in implementing a hierarchical storage system where sufficiently old and non-current versions of content are moved to a secondary backing store, away from the primary copy, yet still accessible.

Heck, 20Gig of RW SCSI II is less than $200 now, even less if you're silly enough to go with IDE on a server.

--
J C Lawrence  cl...@ka...
---------(*)  http://www.kanga.nu/~claw/
The pressure to survive and rhetoric may make strange bedfellows
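One way to read the hierarchical-storage idea is as a periodic "demotion" pass over old revisions. The sketch below assumes two hypothetical store objects with list/put/delete methods; nothing like this exists in PhpWiki.

```php
<?php
// Sketch of the "hierarchical storage" idea: revisions older than a cutoff
// are moved out of the primary store into a cheaper secondary store, but
// remain retrievable. Both store objects and their methods are hypothetical.
function demote_old_revisions($primary, $secondary, $pagename, $max_age_days = 90)
{
    $cutoff = time() - $max_age_days * 86400;

    foreach ($primary->list_revisions($pagename) as $rev) {
        // Never demote the current version, only sufficiently old ones.
        if (!$rev['is_current'] && $rev['mtime'] < $cutoff) {
            $secondary->put_revision($pagename, $rev);
            $primary->delete_revision($pagename, $rev['version']);
        }
    }
}
```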
|
From: Adam S. <la...@sp...> - 2001-06-10 04:04:06
|
> http://www.usemod.com/cgi-bin/mb.pl?KeptPages
> There's nothing at the one you gave.

sorry, you are correct, i must have fat-fingered the cut and paste of the url.

adam.
|
From: Joel U. <uck...@no...> - 2001-06-10 03:57:42
|
Quoth Adam Shand:

[snip]

> the archiving scheme i'd like to see used is the one described here:
>
> http://www.usemod.com/cgi-bin/mb.pl?KeptPage

Is this the URL you wanted? http://www.usemod.com/cgi-bin/mb.pl?KeptPages

There's nothing at the one you gave.

--
J.
|
From: Adam S. <la...@sp...> - 2001-06-10 03:48:25
|
> Just let me get this properly: we only store two versions of each
> page, and writing from the same host (taking into account time
> interval?) is considered to be a change that replaces the current
> version? My personal experience with phpwiki indicated it's true
> (which means I need to start backing stuff up today :-) but I was
> under the impression we had levels of undo like we have in swiki.

the archiving scheme i'd like to see used is the one described here:

http://www.usemod.com/cgi-bin/mb.pl?KeptPage

basically old versions are kept for a fixed amount of time rather than a fixed number of copies. the main advantage of this is that if a malicious user comes in and starts trashing a page (or pages) you always have X amount of time to recover old data. i'd think that a month would be reasonable, though it would probably need to be tuned for high volume sites. sorry if this has already been discussed.

adam.
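The KeptPages scheme boils down to expiring archived versions by age rather than by count. A rough PHP sketch, with a made-up $archive interface and a tunable retention window:

```php
<?php
// Sketch of the UseMod "KeptPages" idea: expire archived versions by age,
// not by count, so a vandalised page can always be restored within the
// retention window. The $archive object and its methods are hypothetical.
define('KEEP_DAYS', 30);   // tune up or down for site volume

function expire_kept_pages($archive, $pagename)
{
    $cutoff = time() - KEEP_DAYS * 86400;

    foreach ($archive->list_versions($pagename) as $ver) {
        // Keep everything newer than the window, plus the newest archived
        // copy, so a page never loses its entire history in one sweep.
        if ($ver['mtime'] < $cutoff && !$ver['is_newest_archived']) {
            $archive->delete_version($pagename, $ver['version']);
        }
    }
}
```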
|
From: Jeff D. <da...@da...> - 2001-06-10 00:18:11
|
I've started a page in the phpwiki wiki on the various PHP RDBMS abstraction libraries. I haven't looked very deeply at any of the choices yet, but feel free to add your comments: http://phpwiki.sourceforge.net/phpwiki/index.php?PhpDatabaseAccessLibraries Jeff |
|
From: Christian R. R. <ki...@as...> - 2001-06-09 22:32:51
|
On Sat, 9 Jun 2001, Steve Wainstead wrote:

> The downside is dbx has to be compiled in, of course. I suppose this is
> the tradeoff we face: PhpWiki 1.4 could use such a database abstraction,
> and that makes it work with more databases; but the users have to have the
> abstraction layer compiled in. I wonder why the PHP maintainers don't
> automatically add dbx when any database supported by dbx is compiled in?

Of course, the other downside is that it's currently only available in CVS PHP4 and as such is virtually non-deployed :-)

Take care,
--
/\/\ Christian Reis, Senior Engineer, Async Open Source, Brazil
~\/~ http://async.com.br/~kiko/ | [+55 16] 274 4311
|
From: Christian R. R. <ki...@as...> - 2001-06-09 22:26:37
|
On Sat, 9 Jun 2001, Sandino Araico Sánchez wrote:

> Instead of using foreach you can pass the array as parameter to strtr().
> Look for details in the strtr() manual.

Quite correct. Attached new version of transform.php patch (renamed brlo to entitymap).

Take care,
--
/\/\ Christian Reis, Senior Engineer, Async Open Source, Brazil
~\/~ http://async.com.br/~kiko/ | [+55 16] 274 4311
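For reference, the strtr() change amounts to replacing the per-pair loop with a single call that takes a replacement map. The map below is only an illustration, not the actual $entitymap from the transform.php patch.

```php
<?php
// strtr() with an array argument replaces all keys in a single pass and
// never re-replaces text it has already substituted. Example map only.
$entitymap = array(
    '&' => '&amp;',
    '<' => '&lt;',
    '>' => '&gt;',
);

$line = 'if (a < b && b > c) { ... }';
$line = strtr($line, $entitymap);
echo $line;   // if (a &lt; b &amp;&amp; b &gt; c) { ... }
```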
|
From: Steve W. <sw...@pa...> - 2001-06-09 22:24:01
|
It appears the database abstraction library that ships with PHP is dbx:

http://www.php.net/manual/en/ref.dbx.php

The downside is dbx has to be compiled in, of course. I suppose this is the tradeoff we face: PhpWiki 1.4 could use such a database abstraction, and that makes it work with more databases; but the users have to have the abstraction layer compiled in. I wonder why the PHP maintainers don't automatically add dbx when any database supported by dbx is compiled in?

The function library is extremely limited... while this means more work on the app end instead of the database, I guess it would be a win overall. I'll compile it in and try it out.

~swain

---
http://www.panix.com/~swain/
"Without music to decorate it, time is just a bunch of boring production deadlines or dates by which bills must be paid." -- Frank Zappa
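For anyone curious what wiki code written against dbx might look like, here is a rough sketch based on the manual page above. It assumes PHP built with --enable-dbx plus MySQL support, and the table and column names are invented for illustration; check the dbx manual for the exact result-object layout.

```php
<?php
// Rough sketch of page listing through the dbx extension. The schema
// (table "wiki", columns "pagename"/"content") is made up for this example.
$link = dbx_connect(DBX_MYSQL, 'localhost', 'wikidb', 'wikiuser', 'secret');
if (!$link) {
    die('could not connect');
}

$result = dbx_query($link, 'SELECT pagename, content FROM wiki ORDER BY pagename');
if ($result) {
    // dbx returns all rows up front in $result->data.
    for ($i = 0; $i < $result->rows; $i++) {
        echo $result->data[$i]['pagename'], "\n";
    }
}

dbx_close($link);
```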
|
From: Christian R. R. <ki...@as...> - 2001-06-09 22:20:05
|
On Sat, 9 Jun 2001, Sandino Araico Sánchez wrote:

> Maybe use soundex() for string matching?
> A soundexed version should be stored in parallel.

Hmmm. AFAIK soundex doesn't exist for every language, and you still need to be able to do raw string matching, for which the encoding will matter.

Take care,
--
/\/\ Christian Reis, Senior Engineer, Async Open Source, Brazil
~\/~ http://async.com.br/~kiko/ | [+55 16] 274 4311
|
From: Sandino A. <sa...@sa...> - 2001-06-09 07:29:01
|
Christian Robottom Reis wrote:

> And the database, if it does string matching, additionally (using LIKE, ~
> and whatnot).

Maybe use soundex() for string matching? A soundexed version should be stored in parallel.

--
Sandino Araico Sánchez
Si no eres parte de la solución, entonces eres parte del precipitado.
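A minimal sketch of the soundex() idea: keep a soundex key alongside each stored title and compare keys for "sounds like" matching. As noted in the reply above, soundex() is English-oriented, so this only helps for Latin-alphabet text.

```php
<?php
// Compare the soundex keys of a stored title and a search query. In a real
// store you would compute and save soundex($title) once, next to the title.
function title_sounds_like($title, $query)
{
    return soundex($title) === soundex($query);
}

var_dump(title_sounds_like('WikiVersioning', 'WikiVersionning'));  // bool(true)
var_dump(title_sounds_like('FrontPage', 'RecentChanges'));         // bool(false)
```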
|
From: Sandino A. <sa...@sa...> - 2001-06-09 07:27:32
|
Jeff Dairiki wrote:

> I suppose entitizing upon output as you suggest doesn't hurt anything,
> but it still seems unnecessary to me.

Entitizing has a big performance cost. An optional pre-entitized cache should be used to avoid entitizing every time the page is displayed.

> Is anyone using PhpWiki with any other non ISO-8859-1 eight-bit charset?

ISO-8859-15

> (I know there have been many requests for multi-byte character support,
> but that's not an easy fix. I think that pretty much requires switching
> to using unicode/UTF-8 internally, and this won't be practical without
> unicode support compiled into PHP and its regexp libraries.)

--
Sandino Araico Sánchez
Si no eres parte de la solución, entonces eres parte del precipitado.
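The "pre-entitized cache" suggestion could look roughly like this: entitize once per page version and reuse the cached result on later displays. The cache layout and function name are invented for illustration; PhpWiki has no such cache.

```php
<?php
// Sketch only: cache the entitized output keyed by page name and version,
// so repeated displays of the same version skip the entitizing pass.
function display_text($pagename, $version, $rawtext, $cachedir = '/tmp/wikicache')
{
    $cachefile = $cachedir . '/' . md5($pagename) . '.' . $version;

    if (file_exists($cachefile)) {
        return implode('', file($cachefile));   // cache hit: no re-entitizing
    }

    $html = htmlentities($rawtext, ENT_QUOTES);

    $fp = @fopen($cachefile, 'w');              // best-effort cache write
    if ($fp) {
        fwrite($fp, $html);
        fclose($fp);
    }
    return $html;
}
```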
|
From: Sandino A. <sa...@sa...> - 2001-06-09 07:20:24
|
Christian Robottom Reis wrote:

> Name: config-new
> config-new Type: Plain Text (TEXT/PLAIN)
> Encoding: BASE64
>
> Name: transform-diff
> transform-diff Type: Plain Text (TEXT/PLAIN)
> Encoding: BASE64

Instead of using foreach you can pass the array as parameter to strtr(). Look for details in the strtr() manual.

--
Sandino Araico Sánchez
Si no eres parte de la solución, entonces eres parte del precipitado.
|
From: Christian R. R. <ki...@as...> - 2001-06-08 22:45:17
|
(Seen today's thread here) Just let me get this properly: we only store two versions of each page, and writing from the same host (taking into account time interval?) is considered to be a change that replaces the current version? My personal experience with phpwiki indicated it's true (which means I need to start backing stuff up today :-) but I was under the impression we had levels of undo like we have in swiki. I'm using flat-file storage here, if it makes a difference. Take care, -- /\/\ Christian Reis, Senior Engineer, Async Open Source, Brazil ~\/~ http://async.com.br/~kiko/ | [+55 16] 274 4311 |
|
From: Joel U. <uck...@no...> - 2001-06-08 22:20:26
|
Quoth Steve Wainstead:

[snip]

> Archiving is in the to-do list
> https://sourceforge.net/pm/task.php?group_project_id=7691&group_id=6121&func=browse
>
> This list is really part to-do, part wish list. Archiving is def. one of
> the top priorities for the next release.
>
> ~swain

I read the to-do entry for archiving, and have a few thoughts and questions.

1. The level of archiving should be wiki-owner-configurable, obviously. The number of page versions kept should be open-ended, from zero to all. That way, users with little disk space can limit the number of versions kept, while those who want to archive everything can do so as well.

2. How does phpwiki store previous versions now? Would it be feasible simply to extend that? If not, what sort of thing do you think would work best? CVS or RCS, perhaps? Keeping a database of numbered diffs?

3. I know phpwiki is divided into the database interface code and the wiki code. It sounds like archiving would mostly affect the wiki code, and the database interface code only if something like CVS or RCS were used.

Anyway, I'd appreciate hearing what you guys think would be the most fruitful approach.

--
J.
|
From: Steve W. <sw...@pa...> - 2001-06-08 21:31:51
|
Since 1.2 was released, development has slowed down, with Jeff doing 99% of the work. I'm just starting to get back into it again after a month of waiting to get DSL set up... it's too painful to do anything at 28.8 anymore.

Archiving is in the to-do list:

https://sourceforge.net/pm/task.php?group_project_id=7691&group_id=6121&func=browse

This list is really part to-do, part wish list. Archiving is def. one of the top priorities for the next release.

~swain

On Fri, 8 Jun 2001, Joel Uckelman wrote:

> I remember a while back (December?) there being some discussion on the list
> of ways to archive old wiki pages more extensively than phpwiki does it
> now. Did anything ever come of that? I'm interested to know, since I'm
> currently hosting two wikis for which more complete archiving would be
> useful. If nothing has been done regarding archiving, I might even be
> interested in doing a little coding to bring it off.
>
> --
> J.
>
> _______________________________________________
> Phpwiki-talk mailing list
> Php...@li...
> http://lists.sourceforge.net/lists/listinfo/phpwiki-talk

---
http://www.panix.com/~swain/
"Without music to decorate it, time is just a bunch of boring production deadlines or dates by which bills must be paid." -- Frank Zappa
|
From: Joel U. <uck...@no...> - 2001-06-08 20:24:10
|
I remember a while back (December?) there being some discussion on the list of ways to archive old wiki pages more extensively than phpwiki does it now. Did anything ever come of that? I'm interested to know, since I'm currently hosting two wikis for which more complete archiving would be useful. If nothing has been done regarding archiving, I might even be interested in doing a little coding to bring it off.

--
J.
|
From: Jeff D. <da...@da...> - 2001-06-08 19:42:14
|
> And the database, if it does string matching, additionally (using LIKE, ~
> and whatnot).

Another good point. (I suspect it will be quite a while before PhpWiki supports unicode.)