From: Jeff D. <da...@da...> - 2000-07-05 23:47:31
In message <Pin...@bo...>, Steve Wainstead writes:
>
>OK, here's my first pass at an administrative module for PhpWiki.
>
>I made a new subdirectory, admin/, which has three files in it. One is
>index.php3, which will work much like the main index.php3: it opens the
>database and goes through an if/elseif/elseif/else block to decide which
>file to load.
Some comments:
First point:
Why is this in its own subdirectory? Why not just an admin.php3 in the
main directory?
Which leads to (quoted from admin/index.php3):
// temporarily go up to the main directory. is there a way around this?
chdir("..");
include "wiki_config.php3";
include "wiki_stdlib.php3";
chdir("admin");
I think all the files which get included or required should be moved into
a 'lib' subdirectory. This is mostly a security issue as it makes it much
easier to prevent people from directly browsing, e.g.,
http://blah/wiki_display.php3.
(Not that this necessarily does anything bad, but there's no reason for
that to be a valid URL at all.)
I suggest that only index.php3 and admin.php3 should be in the top level
directory. These will 'include "lib/wiki_config.php3"' (or maybe 'include
"wikilib/config.php3"'?)
(PHP, as you probably know, does support an include search path via the
configuration variable php_include_path. When PHP is run as an Apache
module, this path can be set in the local .htaccess file. With other
servers, this is probably not so easy. One could write one's own version of
include using file_exists().)
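For instance, something like this (the function name and the search path are
only illustrative):

    <?php
    // Hypothetical helper: look for $file in a few directories and return
    // the first path that actually exists.
    function find_on_path($file) {
        $search = array(".", "lib", "admin");   // example search path
        for ($i = 0; $i < count($search); $i++) {
            if (file_exists($search[$i] . "/" . $file))
                return $search[$i] . "/" . $file;
        }
        return false;   // nothing found
    }
    // and then, instead of a bare include:
    include(find_on_path("wiki_config.php3"));
    ?>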
Second point:
When apache is configured to do external authorization, the variables
$PHP_AUTH_USER and $PHP_AUTH_PW never get set. The solution, in this
case, is just to delete the authorization stuff from admin/index.php3,
since the httpd is handling this anyway.
This is confusing though (for admins setting up a phpwiki, that is).
To maintain maximum plug-and-playness, it might be better to implement
authentication entirely within PHP. The drawback to this is, as always,
added complication: it probably requires cookies and some sort of session
management.
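A very rough sketch of the pure-PHP route, assuming register_globals (so the
cookie and form variables show up as globals, as elsewhere in the code); the
cookie name and form are invented, $admin_passwd would come from
wiki_config.php3, and a real version should use a random session id rather
than a hash of the password:

    <?php
    // No httpd auth involved: check a cookie, or prompt for the password.
    $token = md5($admin_passwd);
    if (isset($wiki_admin) && $wiki_admin == $token) {
        // valid cookie already set -- fall through to the admin actions
    } elseif (isset($password) && $password == $admin_passwd) {
        setcookie("wiki_admin", $token);   // lasts until the browser closes
    } else {
        echo "<form method=\"post\">";
        echo "Admin password: <input type=\"password\" name=\"password\"> ";
        echo "<input type=\"submit\" value=\"Log in\">";
        echo "</form>";
        exit;
    }
    ?>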
>The files it will choose from will be:
>
>* serialize all pages
>* dump all pages as HTML
>* load a set of serialized pages
Files and directories which are writable through the httpd make me nervous,
and I try to minimize their number. (Of course the main databases need to
be writable, so maybe my fear is moot.)
Mostly because of this, I, personally, favor using Perl scripts to do the
dumping sorts of things.
Another slick alternative might be a PHP script which creates a tar- (or zip-)
file dump of the wiki on the fly (to be saved on the web client's disk, rather
than the web server's).
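Skipping the tar/zip part for the moment, the simplest form of that idea --
streaming the serialized pages as one plain download -- might look roughly
like this (GetAllPageNames() and RetrievePage() are stand-ins for whatever
the real wiki_*lib interface turns out to be):

    <?php
    // "Dump to the browser": stream the serialized pages as one download
    // instead of writing anything to the server's disk.
    header("Content-Type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"wikidump.txt\"");
    $pages = GetAllPageNames($dbi);              // stand-in function
    for ($i = 0; $i < count($pages); $i++) {
        $page = RetrievePage($dbi, $pages[$i]);  // stand-in function
        echo serialize($page), "\n";
    }
    ?>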
>* rebuild the DB files (for DBM-based Wikis)
>Third is a Perl script that reduces the size of a DBM file. I will write
>all of it in PHP later but wanted to prove I was right about how DBM files
>lose memory first, and I was... for the savvy sysadmin a Perl script will
>be faster or more flexible a solution (and can be easily cron'd.)
>
>The Perl script shrank the DBM file on wcsb.org from 2,464,640 bytes to
>117,574 (there are 91 pages in it).
Maybe wiki_dbmlib can do this automatically every once in a while?
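For what it's worth, the same rebuild could be sketched with PHP's dbm
functions, roughly what the Perl script does (file names are examples, and a
real version would need to lock the database while it copies):

    <?php
    // Copy every key into a fresh DBM file, then swap it into place.
    $old = dbmopen("wiki_pages.db", "r");
    $new = dbmopen("wiki_pages.new", "n");
    for ($key = dbmfirstkey($old); $key; $key = dbmnextkey($old, $key)) {
        dbminsert($new, $key, dbmfetch($old, $key));
    }
    dbmclose($old);
    dbmclose($new);
    rename("wiki_pages.new", "wiki_pages.db");
    ?>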
Jeff