From: David E. W. <da...@we...> - 2003-03-27 21:59:12
> How large is large?

Qualitatively speaking, about 50+ lines of text. More data as we get
more reproduction cases.

> Could the problem be browser dependent?

Doesn't seem to be so far.

> > There are other somewhat less serious issues, too -- when a wiki node
> > has a large table in it (>200 rows) it simply freezes up the machine.
>
> Is this a "new-style" table or an old-style table (via the OldStyleTable
> plugin)? Is there an example at a publicly accessible URL?

New-style. The URL is not accessible, as it is an intranet wiki with
confidential information. :(

> Or can you send me some example wiki-text which will trigger the bug?

Trigger the table bug? I will post a page to the master phpwiki.

> (Do you really mean "freezes up the machine"? Do other
> concurrent requests to the web server get hung?)

No, but that Apache process goes bonkers, consuming huge amounts of
CPU. PHP terminates it after 30 seconds, giving up.

> Are you sure that things are happy on the MySQL front?
> (Run (my)isamchk on the tables. Any filesystem errors? Enough disk
> space?)

myisamchk succeeds on the .MYI files. The partition has 1.5GB used and
1.8GB free (about 45% used).

> > if (php_sapi_name() == 'apache' || php_sapi_name() == 'apache2filter')
>
> (This has been fixed in CVS.)

Cool. I will continue trying to reproduce this issue in a consistent
fashion to see what is breaking.

-david
From: Steve W. <sw...@pa...> - 2003-03-27 21:13:41
On Thu, 27 Mar 2003, David E. Weekly wrote:

> When we take a large wiki node, edit it, preview it, and save it, whole
> paragraphs are duplicated in the result. This is not operator error - it's
> happened several times, even from cut-and-pasting from a text editor window!
> I'm unclear as to what is causing this corruption, but it has rendered the
> wiki all but unusable for large documents.

If you could provide a sample doc, that would be helpful. It's probably
too large to send to the list, so you could send it to me directly and I
could forward it, or else you could dump it into the demo or test sites:

http://phpwiki.sourceforge.net/test/index.php/HomePage (the nightly build)
http://phpwiki.sourceforge.net/demo/ (is probably closer to 1.3.4)

> There are other somewhat less serious issues, too -- when a wiki node has a
> large table in it (>200 rows) it simply freezes up the machine.

HTML table, I take it... hmm...

> PHP times
> out after 30 seconds and cancels the page load. "Document History" does not
> appear to work correctly for documents that have been edited a large number
> of times. (It does not show the most recent changes made.) Also, our listing
> of RecentChanges seems to have stopped about seven days ago.

I'd wager the database has proprietary info, otherwise I'd try to get a
mysqldump that one of us could set up and test..

~swain

> ----- Original Message -----
> From: "Steve Wainstead" <sw...@pa...>
> To: "David E. Weekly" <da...@we...>
> Cc: <php...@li...>
> Sent: Thursday, March 27, 2003 11:51 AM
> Subject: Re: Serious PHP Wiki Issue
>
> > Good heavens! Did you post to phpwiki-talk? I haven't done any development
> > on PhpWiki in over a year, just doing the releases. Jeff Dairiki is the
> > principal architect over the last two years; I'm sure they can pinpoint
> > the problems very quickly!
> >
> > ~swain
> >
> > On Thu, 27 Mar 2003, David E. Weekly wrote:
> >
> > > Steve,
> > >
> > > I need your help; our company is about to give up on PHP Wiki, declaring
> > > it "too broken to use"! It has been badly corrupting large files,
> > > freezing when tables get too large, etc. Please call me at (deleted by
> > > swain) as soon as you can. I'd love to see PHP Wiki widely implemented,
> > > but I've seen these bugs and they're bad.
> > >
> > > Yours,
> > > David

---
http://www.panix.com/~swain/
"Without music to decorate it, time is just a bunch of boring
production deadlines or dates by which bills must be paid."
-- Frank Zappa
From: Jeff D. <da...@da...> - 2003-03-27 21:07:19
> When we take a large wiki node, edit it, preview it, and save it, whole
> paragraphs are duplicated in the result. This is not operator error -
> it's happened several times, even from cut-and-pasting from a text
> editor window! I'm unclear as to what is causing this corruption, but it
> has rendered the wiki all but unusable for large documents.

How large is large? Could the problem be browser dependent? (I vaguely
remember stories about some browsers having issues when the data in the
textareas gets to be larger than a certain size (32K or 64K).)

> There are other somewhat less serious issues, too -- when a wiki node
> has a large table in it (>200 rows) it simply freezes up the machine.

Is this a "new-style" table or an old-style table (via the OldStyleTable
plugin)? Is there an example at a publicly accessible URL? Or can you
send me some example wiki-text which will trigger the bug?

(Do you really mean "freezes up the machine"? Do other concurrent
requests to the web server get hung?)

> "Document History" does not
> appear to work correctly for documents that have been edited a large
> number of times. (It does not show the most recent changes made.) Also,
> our listing of RecentChanges seems to have stopped about seven days ago.

Those problems sound related. I've never seen nor heard of behavior like
that before. Are you sure that things are happy on the MySQL front?
(Run (my)isamchk on the tables. Any filesystem errors? Enough disk
space?)

> Have any of you guys seen this?

I haven't.

> if (php_sapi_name() == 'apache' || php_sapi_name() == 'apache2filter')

(This has been fixed in CVS.)

> Could it be something about our Apache2 interactions that's messing
> things up?

I doubt it, but I don't have an alternative theory either, so who knows?
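Jeff's hunch about browser textarea limits is easy to triage: check
whether the affected pages are bigger than roughly 32K before blaming
the database. This is a minimal sketch, not PhpWiki code -- the `pages`
dict here is an invented stand-in for whatever a real backend query
would return.

```python
# Flag wiki pages whose source is big enough to trip the ~32K/64K
# textarea limits some early-2000s browsers were rumored to have.
# `pages` maps pagename -> wiki source text (a stand-in, not the
# PhpWiki API).

def oversized_pages(pages, limit=32 * 1024):
    """Return (name, byte_size) pairs at or over `limit`, largest first."""
    hits = [(name, len(src.encode("utf-8")))
            for name, src in pages.items()
            if len(src.encode("utf-8")) >= limit]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

pages = {
    "HomePage": "x" * 100,
    "BigSpecDocument": "y" * 40000,  # over the 32K threshold
}
for name, size in oversized_pages(pages):
    print(f"{name}: {size} bytes")  # only BigSpecDocument is listed
```

Pages flagged by a check like this are the first candidates for the
"duplicate paragraphs on save" reproduction attempts discussed above.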
From: David E. W. <da...@we...> - 2003-03-27 20:59:11
Steve,

Thanks for getting back to me.

Jeff and others - our company has vigorously adopted PHPWiki in part due
to my strong recommendations and prior experience working with this
wiki. We've encountered a number of issues with PHPWiki, however, the
most serious by far being corruption of large documents.

When we take a large wiki node, edit it, preview it, and save it, whole
paragraphs are duplicated in the result. This is not operator error -
it's happened several times, even from cut-and-pasting from a text
editor window! I'm unclear as to what is causing this corruption, but it
has rendered the wiki all but unusable for large documents.

There are other somewhat less serious issues, too -- when a wiki node
has a large table in it (>200 rows) it simply freezes up the machine.
PHP times out after 30 seconds and cancels the page load. "Document
History" does not appear to work correctly for documents that have been
edited a large number of times. (It does not show the most recent
changes made.) Also, our listing of RecentChanges seems to have stopped
about seven days ago.

While these are all issues I'd like to see resolved, without the first
-- namely fixing the corruption issue -- we're going to have to switch
to a different wiki. Have any of you guys seen this? Is this a
misconfiguration? Is it a bug with an easy fix?

We're using MySQL 3.23.49 as the backing store, PHP 4.3.1 with MySQL,
Apache 2.0.43, and PEAR v1.50. This is using PHPWiki 1.3.4. One patch
that you might want to know about (tiny) is in lib/config.php, where in
order to support Apache2, this line:

    if (php_sapi_name() == 'apache')

got changed to this line:

    if (php_sapi_name() == 'apache' || php_sapi_name() == 'apache2filter')

Could it be something about our Apache2 interactions that's messing
things up?

Please help! :)

Cheerio,
Dave Weekly
Developer, There.com

----- Original Message -----
From: "Steve Wainstead" <sw...@pa...>
To: "David E. Weekly" <da...@we...>
Cc: <php...@li...>
Sent: Thursday, March 27, 2003 11:51 AM
Subject: Re: Serious PHP Wiki Issue

> Good heavens! Did you post to phpwiki-talk? I haven't done any
> development on PhpWiki in over a year, just doing the releases. Jeff
> Dairiki is the principal architect over the last two years; I'm sure
> they can pinpoint the problems very quickly!
>
> ~swain
>
> On Thu, 27 Mar 2003, David E. Weekly wrote:
>
> > Steve,
> >
> > I need your help; our company is about to give up on PHP Wiki,
> > declaring it "too broken to use"! It has been badly corrupting large
> > files, freezing when tables get too large, etc. Please call me at
> > (deleted by swain) as soon as you can. I'd love to see PHP Wiki
> > widely implemented, but I've seen these bugs and they're bad.
> >
> > Yours,
> > David
>
> ---
> http://www.panix.com/~swain/
> "Without music to decorate it, time is just a bunch of boring
> production deadlines or dates by which bills must be paid."
> -- Frank Zappa
From: Micki K. <mic...@co...> - 2003-03-27 20:49:56
Hi there!

I worked with DevNull Cat, and we came up with a recursive Include
version of the SiteMap plugin. The full text is below.

INSTRUCTIONS and NOTES

Using the 'IncludeSiteMap' plugin for phpwiki (which I tweaked together
with the author of the 'SiteMap' plugin) you can have each page of the
auto-generated outline appear below, on one big page.

Known issues:
- it has a problem with 'subpages' (WikiName/WikiOtherName). Nested
  IncludePages work fine.
- 'MetaData' is not displayed recursively... instead, the 'actual'
  page's MetaData is displayed any time an Include-d page calls the
  EditMetaData plugin.

Upcoming features:
- nesting the includes INSIDE the outline (may be too ambitious).
- fixing 'subpages' issues.

Here's the text to add to your WikiPages:

    <?plugin IncludeSiteMap ?>

You can use

    direction=forward/back

to determine the direction, and

    page=WikiPageName

to determine the 'parent' of the outline (if left blank, the parent is
the current page). Currently the recursion limit is set to 8.

The contents of the IncludeSiteMap plugin (to be stored in
wiki/lib/plugin):

---------

<?php // -*-php-*-
rcs_id('$Id: SiteMap.php,v 1.5 2002/11/04 19:17:16 carstenklapp Exp $');
/**
 http://sourceforge.net/tracker/?func=detail&aid=537380&group_id=6121&atid=306121
 Submitted By: Cuthbert Cat (cuthbertcat)

 This is a quick mod of BackLinks to do the job recursively. If your
 site is categorized correctly, and all the categories are listed in
 CategoryCategory, then a RecBackLinks there will produce a contents
 page for the entire site. The list is as deep as the recursion level.

 direction: Get BackLinks or forward links (links listed on the page)

 firstreversed: If true, get BackLinks for the first page and forward
 links for the rest. Only applicable when direction = 'forward'.

 excludeunknown: If true (default) then exclude any mentioned pages
 which don't exist yet. Only applicable when direction = 'forward'.
*/
require_once('lib/PageList.php');

class WikiPlugin_IncludeSiteMap
extends WikiPlugin
{
    function getName () {
        return _("IncludeSiteMap");
    }

    function getDescription () {
        return sprintf(_("IncludeSiteMap: Recursively get BackLinks or links for %s"),
                       '[pagename]');
    }

    function getDefaultArguments() {
        return array('exclude'        => '',
                     'include_self'   => 0,
                     'noheader'       => 0,
                     'page'           => '[pagename]',
                     'description'    => $this->getDescription(),
                     'reclimit'       => 8,
                     'info'           => false,
                     'direction'      => 'back',
                     'firstreversed'  => false,
                     'excludeunknown' => true
                     );
    }

    // info arg allows multiple columns
    // info=mtime,hits,summary,version,author,locked,minor
    //
    // exclude arg allows multiple pagenames
    // exclude=HomePage,RecentChanges

    function recursivelyGetBackLinks($startpage, $pagearr,
                                     $level = '*', $reclimit = '***') {
        static $VisitedPages = array();
        $startpagename = $startpage->getName();
        //trigger_error("DEBUG: recursivelyGetBackLinks( $startpagename , $level )");
        if ($level == $reclimit)
            return $pagearr;
        if (in_array($startpagename, $VisitedPages))
            return $pagearr;
        array_push($VisitedPages, $startpagename);
        $pagelinks = $startpage->getLinks();
        while ($link = $pagelinks->next()) {
            $linkpagename = $link->getName();
            if (($linkpagename != $startpagename)
                && !in_array($linkpagename, $this->ExcludedPages)) {
                $pagearr[$level . " [$linkpagename]"] = $link;
                $pagearr = $this->recursivelyGetBackLinks($link, $pagearr,
                                                          $level . '*',
                                                          $reclimit);
            }
        }
        return $pagearr;
    }

    function recursivelyGetLinks($startpage, $pagearr,
                                 $level = '*', $reclimit = '***') {
        static $VisitedPages = array();
        $startpagename = $startpage->getName();
        //trigger_error("DEBUG: recursivelyGetLinks( $startpagename , $level )");
        if ($level == $reclimit)
            return $pagearr;
        if (in_array($startpagename, $VisitedPages))
            return $pagearr;
        array_push($VisitedPages, $startpagename);
        $reversed = (($this->firstreversed)
                     && ($startpagename == $this->initialpage));
        //trigger_error("DEBUG: \$reversed = $reversed");
        $pagelinks = $startpage->getLinks($reversed);
        while ($link = $pagelinks->next()) {
            $linkpagename = $link->getName();
            if (($linkpagename != $startpagename)
                && !in_array($linkpagename, $this->ExcludedPages)) {
                if (!$this->excludeunknown
                    || $this->dbi->isWikiPage($linkpagename)) {
                    $pagearr[$level . " [$linkpagename]"] = $link;
                    $pagearr = $this->recursivelyGetLinks($link, $pagearr,
                                                          $level . '*',
                                                          $reclimit);
                }
            }
        }
        return $pagearr;
    }

    function run($dbi, $argstr, $request) {
        $args = $this->getArgs($argstr, $request, false);
        extract($args);
        if (!$page)
            return '';
        $out = '';
        $exclude = $exclude ? explode(",", $exclude) : array();
        if (!$include_self)
            $exclude[] = $page;
        $this->ExcludedPages = $exclude;
        $this->_default_limit = str_pad('', 3, '*');
        if (is_numeric($reclimit)) {
            if ($reclimit < 0) $reclimit = 0;
            if ($reclimit > 10) $reclimit = 10;
            $limit = str_pad('', $reclimit + 2, '*');
        } else {
            $limit = '***';
        }
        if (! $noheader)
            $out .= $description . " "
                 . sprintf(_("(max. recursion level: %d)"), $reclimit)
                 . ":\n\n";
        $pagelist = new PageList($info, $exclude);
        $p = $dbi->getPage($page);
        $pagearr = array();
        if ($direction == 'back')
            $pagearr = $this->recursivelyGetBackLinks($p, $pagearr, "*", $limit);
        else {
            $this->dbi = $dbi;
            $this->initialpage = $page;
            $this->firstreversed = $firstreversed;
            $this->excludeunknown = $excludeunknown;
            $pagearr = $this->recursivelyGetLinks($p, $pagearr, "*", $limit);
        }
        reset($pagearr);
        $includepages = true;
        $nothing = "";
        while (list($key, $link) = each($pagearr)) {
            if ($includepages) {
                $a = substr_count($key, '*');
                $indenter = str_pad($nothing, $a);
                $out .= $indenter . '<?plugin IncludePage page='
                     . $link->getName() . " ?>" . "\n";
            } else {
                $out .= $key . "\n";
            }
        }
        return TransformText($out);
    }
};

--
Micki Kaufman
245 8th Avenue, #188
New York, NY 10011
917 450-9137 (c)
mailto:mic...@co...
From: Steve W. <sw...@pa...> - 2003-03-27 19:51:21
Good heavens! Did you post to phpwiki-talk? I haven't done any
development on PhpWiki in over a year, just doing the releases. Jeff
Dairiki is the principal architect over the last two years; I'm sure
they can pinpoint the problems very quickly!

~swain

On Thu, 27 Mar 2003, David E. Weekly wrote:

> Steve,
>
> I need your help; our company is about to give up on PHP Wiki,
> declaring it "too broken to use"! It has been badly corrupting large
> files, freezing when tables get too large, etc. Please call me at
> (deleted by swain) as soon as you can. I'd love to see PHP Wiki widely
> implemented, but I've seen these bugs and they're bad.
>
> Yours,
> David

---
http://www.panix.com/~swain/
"Without music to decorate it, time is just a bunch of boring
production deadlines or dates by which bills must be paid."
-- Frank Zappa
From: Jeff D. <da...@da...> - 2003-03-27 17:23:38
> Am I doing something wrong, or won't version 1.3.3 support this?

Sorry to have gotten your hopes up, Sandy. Upon inspection of the 1.3.3
code, it doesn't support that.
From: Marc T. <to...@un...> - 2003-03-27 16:28:41
Hello,

Is it possible to run (force) IMAP login on SSL?

Marc.
From: Sandy M. <mat...@bt...> - 2003-03-27 13:59:28
> Yes, probably. Something like [<url to image>|<link url>] should work.
> E.g.
> [http://www.path.to/some/image.gif | http://www.somewhere.com/link/]
>
> You can also make wiki-links this way:
> [http://www.path.to/some/image.gif | HomePage ]
>
> I think, but I'm not positive, that the above "ImageLink"s will work
> in 1.3.3. They will work with the latest CVS code (I just tested,
> found and fixed a bug having to do with this.)

I have tried both of these, to no avail. A test page is at:

http://diocese.dsvr.co.uk/wiki/index.php/TestImages

where I have tried the various combinations of Image and Link URL.

Am I doing something wrong, or won't version 1.3.3 support this?

Sandy
From: <mal...@cs...> - 2003-03-27 00:39:12
On Wed, Mar 26, 2003 at 05:52:20PM -0600, Bob Apthorpe wrote:
> Hi,
>
> On Thu, 27 Mar 2003, Malcolm Ross Kinsella Ryan wrote:
>
> > How much more work is planned before 1.4 can be released?
>
> Or at least a 1.3.5 (or 1.3.5a_rc1 ...)

Actually, 1.3.5 is exactly what I _don't_ want. I want a release branch
which is stable. That means that the only updates are bug-fixes. 1.3 is
a development branch, which means that while bugs are fixed, new
features are constantly being added alongside the fixes, resulting in a
net increase, not decrease, in bugs. This is a natural part of the
development process, and will not go away unless you focus on producing
a stable release.

Sorry, I don't mean to be whingy or belligerent. I just think you guys
have a quality product to offer here, but you are shooting yourselves in
your collective feet by making it too hard for non-developers to use.

Malcolm

--
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
"Blessed are the pure in heart, for they will see God." -- Matt 5:8
From: Bob A. <apt...@cy...> - 2003-03-26 23:52:24
Hi,

On Thu, 27 Mar 2003, Malcolm Ross Kinsella Ryan wrote:

> How much more work is planned before 1.4 can be released?

Or at least a 1.3.5 (or 1.3.5a_rc1 ...)

-- Bob
From: <mal...@cs...> - 2003-03-26 23:45:27
Can you guys please make a stable release branch soon? The last stable
release (1.2) has been left far behind. The releases in the 1.3 branch
are buggy, and while these bugs are all "fixed in the nightly tarball",
I am reluctant to use such bleeding-edge code.

How much more work is planned before 1.4 can be released?

Malcolm

--
Malcolm Ryan - mal...@cs... - http://www.cse.unsw.edu.au/~malcolmr/
"Blessed are the pure in heart, for they will see God." -- Matt 5:8
From: Marc T. <to...@un...> - 2003-03-26 20:27:31
Jeff Dairiki writes:
---| > 2) I've selected the fr locale, and put
---| > if (!defined('WIKI_PGSRC')) define('WIKI_PGSRC', 'locale/fr/pgsrc');
---| >
---| > But after the initialisation, I have no HomePage since in French the
---| > HomePage is called Accueil.
---|
---| Are you saying you have no page named Accueil? You should, I think.

I have a page named Accueil, but the logo is linked to the HomePage's
url (and the index.php also tries to load HomePage). I think there
should be a constant HOMEPAGE_NAME or whatever that points to the
localized name for HomePage, no? Note that "loading a new wiki" starts
with ignoring all English pages and then creates French ones.

---| > 3) I also have php warnings in a frame at the end of the page. Is
---| > it interesting to send warnings to the mailing list? Is it
---| > possible to avoid them?
---|
---| Send them to the list or to me.
---|
---| But before you do that, I'd suggest you update to the latest (CVS)
---| version of the code. Lots of those sorts of bugs have been fixed
---| since 1.3.4.
---|
---| If you don't want to deal with CVS, there's a nightly snapshot
---| of the CVS code at:
---| http://phpwiki.sf.net/nightly/phpwiki.nightly.tar.gz
---|
---| Jeff

Ok, with the nightly build (great job) there are fewer warnings. Only
some warnings about the interwiki map loading.

BTW, ./locale/fr/pgsrc/SteveWainstead has edit conflicts. What does it
mean?

Marc.
From: Jeff D. <da...@da...> - 2003-03-26 19:41:50
On Wed, 26 Mar 2003 16:45:47 -0000 "Sandy Matheson" <mat...@bt...> wrote:

> Is it possible to attach a hyperlink to an image within PhpWiki?
>
> I am using version 1.3.3

Yes, probably. Something like [<url to image>|<link url>] should work.
E.g.

[http://www.path.to/some/image.gif | http://www.somewhere.com/link/]

You can also make wiki-links this way:

[http://www.path.to/some/image.gif | HomePage ]

I think, but I'm not positive, that the above "ImageLink"s will work
in 1.3.3. They will work with the latest CVS code (I just tested,
found and fixed a bug having to do with this.)

(You could also use the RawHtml plugin --- but I don't think that's in
1.3.3.)
From: Sandy M. <mat...@bt...> - 2003-03-26 16:45:43
Is it possible to attach a hyperlink to an image within PhpWiki?

I am using version 1.3.3.

Thanks for a great program; we have just started to roll it out for
seven different CofE Parish Websites within our Diocese in the UK.

Sandy Matheson
From: Marc T. <to...@un...> - 2003-03-26 12:35:44
Hello,

Is it possible to connect to pgsql without a TCP socket? (phpwiki 1.3.4)

Marc.
From: Jeff D. <da...@da...> - 2003-03-25 21:22:12
Oliver kindly sent me a copy of his wiki_pagedb.gdbm, and with that I
easily reproduced his problem.

The problem was not empty pagenames but integral-valued pagenames.
Pagenames like '1' were being stored in the dba link table as the
integer one, rather than the string '1'.

This was compounded by the fact that some of the entries in the link
table were bogus. I suspect this was because the last time the page was
edited was probably under an older (1.3.4 or before) version of PhpWiki
code which had broken link extraction code. (Re-editing the pages with
bogus links caused the bogus links to go away.)

I've just committed fixes to CVS. The most important one being this one:

http://cvs.sourceforge.net/cgi-bin/viewcvs.cgi/phpwiki/phpwiki/lib/CachedMarkup.php.diff?r1=1.6&r2=1.7

If you're seeing this problem, after you update to latest CVS code, you
probably need to rebuild the link database. The easiest way to rebuild
the link database is to make a zip-dump of your wiki and then
re-initialize a new wikidb from the zip dump:

1. Make a zip dump.
2. Copy the zip dump to the server somewhere.
3. Point WIKI_PGSRC (in index.php) at the zip dump.
4. Browse to http://path.to/your/wiki/index.php?overwrite=1

(The overwrite=1 is a hack^H^H^H^Htrick to avoid the endless "file has
edit conflicts" messages...)
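The pitfall Jeff describes -- a pagename like '1' silently degrading
from string to integer when used as an array key -- suggests
normalizing keys at the storage boundary. Here is a hedged Python
illustration of that defensive idea; the function names are invented
for this sketch and are not PhpWiki code (in PHP, unlike Python,
numeric string array keys are cast to int, which is what made the
original bug possible).

```python
# A link table keyed by pagename can end up holding the integer 1
# where the string '1' was meant, if callers pass numeric names.
# Forcing every key (and stored target) to str at the boundary makes
# lookups consistent regardless of what the caller passed.

def store_links(link_table, pagename, targets):
    """Record a page's outgoing links, normalizing all names to str."""
    link_table[str(pagename)] = [str(t) for t in targets]

def links_of(link_table, pagename):
    """Look up links, tolerating callers that pass an int pagename."""
    return link_table.get(str(pagename), [])

link_table = {}
store_links(link_table, 1, ["HomePage", 2])   # numeric names sneak in
print(links_of(link_table, "1"))              # ['HomePage', '2']
```

The same "normalize once, at the edge" approach is why re-initializing
from a zip dump (steps 1-4 above) repairs an already-polluted table:
every pagename passes through string-typed page source on the way back
in.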
From: Jeff D. <da...@da...> - 2003-03-25 02:03:28
> "lib/WikiDB.php(In template 'browse'?)(In template 'body'?)(In
> template 'html'?):167: Fatal[0]: <br />/mywiki/lib/WikiDB.php:167: :
> Assertion failed <br />"
>
> While 1.3.4 doesn't even show the text, only this error:
> "lib/WikiDB.php:160: Fatal[0]: <br />/mywiki/lib/WikiDB.php:160: :
> Assertion failed <br />"
>
> In both cases, there is:
>
>     function getPage($pagename) {
>         assert(is_string($pagename) && $pagename);
>         return new WikiDB_Page($this, $pagename);
>     }
>
> With a fresh created Wiki, WantedPages worked. I can't see why.
>
> Any hints?

I'm not sure, but my guess is that there's a record in the page table
with pagename = '' (the empty string).

Try using the 'Exorcise WikiDB' button on the bottom of
PhpWikiAdministration (it's only in the CVS version, not 1.3.4). That
may or may not help. Report back in either case.

What backend are you using?
From: Oliver B. <ob...@de...> - 2003-03-25 00:19:14
Hello All,

in PhpWiki 1.3.4 and the nightly CVS version (2003-03-18) WantedPages
didn't work on an existing database.

The nightly CVS version shows the introductory text of WantedPages and
this error:

"lib/WikiDB.php(In template 'browse'?)(In template 'body'?)(In template
'html'?):167: Fatal[0]: <br />/mywiki/lib/WikiDB.php:167: : Assertion
failed <br />"

While 1.3.4 doesn't even show the text, only this error:

"lib/WikiDB.php:160: Fatal[0]: <br />/mywiki/lib/WikiDB.php:160: :
Assertion failed <br />"

In both cases, there is:

    function getPage($pagename) {
        assert(is_string($pagename) && $pagename);
        return new WikiDB_Page($this, $pagename);
    }

With a freshly created Wiki, WantedPages worked. I can't see why.

Any hints?

Oliver
--
Oliver Betz, Muenchen
From: Colleen D. <pla...@pr...> - 2003-03-23 18:01:25
Just installed phpWiki 1.3.4 with MySQL, as per the INSTALL.mysql file.
Created the db and used the schema to create the empty tables. Created
mysql user phpwiki with the required GRANTS as per the doc. So far so
good.

Here is what I get when I try to open the page on localhost:

lib/WikiDB/backend/PearDB.php:681: Fatal[256]: wikidb_backend_mysql:
fatal database error

    DB Error: syntax error
    (-1440 [nativecode=1064 ** You have an error in your SQL syntax
    near '-1440' at line 1])

Warning: Unknown(): A session is active. You cannot change the session
module's ini settings at this time. in Unknown on line 0
From: Jean-Philippe G. <jpg...@ou...> - 2003-03-22 18:03:45
Hi,

I want to use PhpWiki with another language. I configure PhpWiki with
index.php but the HomePage stays in English.

In lib/config.php I have to use something like:

    if (!defined('HOME_PAGE')) define('HOME_PAGE', _("HomePageLocalized"));

Is there an easier way to do this?

--
Jean-Philippe Georget
jpg...@ou... - http://jpgeorget.ouvaton.org/
From: Peter B. <ss...@ho...> - 2003-03-22 04:00:30
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Rise up against the Swedish state's gambling monopoly!
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

Hi php...@li...,

Did you know that the Swedish state makes more than 4500 million kronor
every year from its gambling monopoly? All of it goes straight into the
state's pocket!

Lately a number of alternative operators have appeared, offering gaming
and betting over the Internet. The advantage for you as a player is
higher winnings, better odds and genuine gaming fun! Today we recommend
the newly opened:

Casino Stockholm
http://www.mallwiz.com%40%48%6F%6D%65%2E%65%61%72%74%68%6C%69%6E%6B%2E%6E%65%74%2F%7E%6B%61%66%66%65%6B%61%6B%61%73%2F%63%73/
- Sweden's first real Internet casino!

* Play online around the clock.
* 58 different games - including blackjack, poker, roulette, one-armed
  bandits, dice, Sic Bo, keno and slot machines.
* INCREDIBLE progressive jackpots - win 2 million on the slot machines!
* Chat with other players from all over Sweden and the rest of the
  world.
* Play for fun with play money - train up your skills.

Opening offer: get $75 in *REAL* chips as a bonus!

=====================================================================
This is a non-commercial mailing from the organization Stop the Swedish
Gaming Monopoly (SSSM). We try to promote a well-rounded debate and now
and then recommend alternative gambling options. If you do not wish to
receive information from us in the future, please send an email to
sss...@sp... with "Remove" in the subject line.

Next week we review: Unibet

This message is a broadcast to the general public and is thereby
protected under the Swedish Fundamental Law on Freedom of Expression
(YGL). (Cf. SOU 1997:49 pp. 170 ff.)
Responsible publisher: Peter Backman
=====================================================================
From: Marc T. <to...@un...> - 2003-03-21 17:15:55
Hello,

I've tried to do this:

1. put a .htaccess to restrict access in the phpwiki directory
2. switch to postgresql
3. because the connection requires a password, I changed the connection
   a little bit in lib/pgsql.php:

    $connectstring = $pg_dbhost?"host=$pg_dbhost ":"";
    $connectstring .= $pg_dbport?"port=$pg_dbport ":"";
    $connectstring .= $WikiDataBase?"dbname=$WikiDataBase ":"";
    $connectstring .= $WikiDataBase?"password=xxxx":"";

   (I should add a variable in the general conf file...)

The connection is successful. Problems:

1) I obtain standard wiki pages, but it seems that I am not able to
   edit pages.
2) I cannot access admin.php (it loops forever and tries to open
   admin.php either with the admin passwd or with the login defined in
   .htaccess).

Any idea?

Marc.
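A connect string assembled by hand like the patch above is easy to get
subtly wrong (a missing trailing space, or the password conditioned on
the database name rather than on the password itself). Here is a
minimal Python sketch of building a libpq-style keyword/value string;
the parameter names follow libpq conventions, and leaving `host` empty
is what makes libpq use the local Unix-domain socket instead of TCP,
which also bears on the earlier question about connecting to pgsql
without a TCP socket. Values containing spaces would need quoting,
which this sketch omits.

```python
# Build a libpq keyword/value connection string, emitting each
# parameter only when it is set.  With no host, libpq connects over
# the local Unix-domain socket rather than TCP.

def pg_connect_string(host="", port="", dbname="", password=""):
    parts = []
    if host:
        parts.append(f"host={host}")
    if port:
        parts.append(f"port={port}")
    if dbname:
        parts.append(f"dbname={dbname}")
    if password:                      # tied to the password itself,
        parts.append(f"password={password}")  # not to the dbname
    return " ".join(parts)

print(pg_connect_string(dbname="phpwiki", password="s3cret"))
# dbname=phpwiki password=s3cret
```

Joining the parts with a single separator also avoids the
trailing-space bookkeeping that the string-concatenation approach in
lib/pgsql.php has to get right on every line.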
From: Danil <ul...@ru...> - 2003-03-20 19:13:47
I've installed phpwiki on my mySQL ;-) server and it looks quite slow.
How can I increase speed? Probably the non-SQL version is better?

Thanks,
Danil
mailto:ul...@ru...