pycs-devel Mailing List for Python Community Server (Page 21)
Status: Alpha
Brought to you by: myelin
Messages per month:

| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|------|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|-----|
| 2002 |     |     |     | 1   |     |     |     | 2   | 3   | 1   | 70  | 41  |
| 2003 | 20  | 9   | 36  | 11  | 3   | 6   | 3   | 13  | 2   | 32  | 4   | 7   |
| 2004 | 14  | 16  | 3   | 12  | 1   | 4   | 13  | 1   | 2   | 1   | 2   | 3   |
| 2005 | 7   |     |     | 4   |     | 2   |     | 5   |     |     | 2   | 1   |
| 2006 |     |     |     |     | 1   |     |     | 1   | 2   | 7   | 18  | 22  |
| 2007 | 10  | 11  | 1   | 6   | 5   | 5   | 14  | 28  | 4   | 6   | 9   | 8   |
| 2008 | 10  | 19  | 38  | 17  | 13  | 7   | 36  | 15  | 2   |     |     |     |
From: Phillip P. <ph...@my...> - 2002-11-17 07:30:29
Weird. They've been having trouble with the archives recently - maybe they'll come back online soon. Perhaps I should set up an alternative archive ... hmm ... perhaps a bzero blog on pycs.net like the MT one I did of the group-forming list at http://dev.myelin.co.nz/gf/ ;-)
> From: "Georg Bauer" <gb...@mu...>
> Just noticed that the archives for this list are not available any more
> at sourceforge. Too bad it happens just when my own system goes bozo and
> kills mail bodies. I do know that I received two mails from the list,
> but I don't know from whom and can't read the content :-/
>
> Grmbl. Don't like broken mail systems, they always make me nervous ...
From: Georg B. <gb...@mu...> - 2002-11-16 23:24:09
Hi!
Just noticed that the archives for this list are not available any more at sourceforge. Too bad it happens just when my own system goes bozo and kills mail bodies. I do know that I received two mails from the list, but I don't know from whom and can't read the content :-/
Grmbl. Don't like broken mail systems, they always make me nervous ...
bye, Georg
From: Phillip P. <ph...@my...> - 2002-11-16 22:10:36
Georg Bauer wrote:
>
> >>> import xmlrpclib
> >>> server = xmlrpclib.ServerProxy('http://127.0.0.1:5335/RPC2')
> >>> server.aggregator.getSubs('YYYYY','XXXXX')
[...]
> File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
> 390, in feed
> self._parser.Parse(data, 0)
> UnicodeError: UTF-8 decoding error: invalid data
You've probably already figured this out, but Python is complaining here
because Radio is returning invalid XML. This seems odd, because you *are*
calling the right location ('http://127.0.0.1:5335/RPC2'), and if it can't
find the function you're after, it should return an xmlrpc fault.
(If you try calling xmlrpc methods on, say, http://muensterland.org/, you'll
get the same error, because it will return you an HTML page instead of the
expected XML).
I've seen invalid XML coming out of the aggregator API functions before,
though, so it might just be something getting screwed up inside Radio.
Perhaps it's an issue with international characters ... the XML-RPC spec
says everything has to be in US ASCII format, which tends to mangle accented
chars rather badly. I'm not quite sure what's going to happen with this.
Cheers,
Phil :)
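To illustrate the difference Phillip describes - a proper XML-RPC fault versus a response that is not valid XML at all - here is a minimal, era-appropriate sketch (Python 2.x xmlrpclib; the credentials are placeholders):

import xmlrpclib

server = xmlrpclib.ServerProxy('http://127.0.0.1:5335/RPC2')
try:
    print server.aggregator.getSubs('user@example.com', 'password')
except xmlrpclib.Fault, f:
    # The endpoint spoke proper XML-RPC but rejected the call
    # (unknown method, bad credentials, ...).
    print 'XML-RPC fault %s: %s' % (f.faultCode, f.faultString)
except xmlrpclib.ProtocolError, e:
    # HTTP-level failure (404, 500, ...).
    print 'protocol error %s at %s' % (e.errcode, e.url)
except Exception, e:
    # Anything else - including the UnicodeError above - usually means the
    # response body was not well-formed XML (for example an HTML page).
    print 'could not parse the response:', e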
From: Phillip P. <ph...@my...> - 2002-11-16 22:04:10
Georg Bauer wrote:
>
> ["xmlrpc://127.0.0.1:5335/RPC2"].aggregator.getSubs("XXXXX","XXXXX")
>
> Now this tells me that getSubs can't be invoked because it isn't
> defined. A cross check against the metaWebLog.getPost function shows it
> would work, if the function was defined. But it isn't. radio.root is up
> to date. Whats that?
Have you installed the aggregator API handlers? I haven't been checking
recently, but last time I looked, they weren't being deployed through the
Radio.root update system.
The link is on the radio-dev mailing list somewhere ... right now I'm seeing
it as the most recently changed file in Jake Savin's 'gems' directory:
http://jake.userland.com/gems/?C=M&O=D
Here's a direct link:
http://jake.userland.com/gems/aggregatorApi021114.fttb
Cheers,
Phil :)
From: Georg B. <gb...@mu...> - 2002-11-16 20:11:46
Hi!
> On another thing: Anybody used Mac Python to connect to XML-RPC servers? I
> just tried to fiddle around with the aggregator API, and get the
> following, regardless of which function I call.
It gets weirder. I thought, wait, RU has an XML-RPC implementation, try
that instead. So I made the following quick script:
["xmlrpc://127.0.0.1:5335/RPC2"].aggregator.getSubs("XXXXX","XXXXX")
Now this tells me that getSubs can't be invoked because it isn't
defined. A cross-check against the metaWebLog.getPost function shows it
would work, if the function was defined. But it isn't. radio.root is up
to date. What's that about?
Lots of head-scratching on my side ...
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-16 19:29:08
Hi!
> For the record, there's 2 big coffee mugs, and 1 teeny
> one. ;-)
Uhm. One big one with steam coming off its top and one with a cactus.
Those teeny ones to the right are something different. Just click on the
big one with OPML. Or use the URL Robert provided.
On another thing: Anybody used Mac Python to connect to XML-RPC servers? I
just tried to fiddle around with the aggregator API, and get the
following, regardless of which function I call.
Anybody got any ideas? Phillip?
>>> import xmlrpclib
>>> server = xmlrpclib.ServerProxy('http://127.0.0.1:5335/RPC2')
>>> server.aggregator.getSubs('YYYYY','XXXXX')
Traceback (most recent call last):
File "<input>", line 1, in ?
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
821, in __call__
return self.__send(self.__name, args)
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
975, in __request
verbose=self.__verbose
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
853, in request
return self.parse_response(h.getfile())
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
891, in parse_response
p.feed(response)
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
390, in feed
self._parser.Parse(data, 0)
UnicodeError: UTF-8 decoding error: invalid data
>>> server.blah()
Traceback (most recent call last):
File "<input>", line 1, in ?
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
821, in __call__
return self.__send(self.__name, args)
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
975, in __request
verbose=self.__verbose
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
853, in request
return self.parse_response(h.getfile())
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
891, in parse_response
p.feed(response)
File "Festplatte:Applications:Python 2.2.1:Lib:xmlrpclib.py", line
390, in feed
self._parser.Parse(data, 0)
UnicodeError: UTF-8 decoding error: invalid data
>>>
bye, Georg
From: Robert B. <rba...@ma...> - 2002-11-16 18:41:56
Hello,
> Would you mind also adding the URL radio prompts me
> for?
This URL should get everyone to Georg's outliner via Radio's Open URL option:
http://hugo.muensterland.org/instantOutliner/georgBauer.opml
Regards,
Robert
From: Dean G. <goo...@ya...> - 2002-11-16 17:56:23
Really dumb newbie question ...
> you can subscribe to
> my instant outline
> at http://hugo.muensterland.org/ - just click on the
> big coffee mug.
For the record, there's 2 big coffee mugs, and 1 teeny one. ;-) I'm guessing you mean the BFCM with a cactus, or the one that says "OPML"? Would you mind also adding the URL Radio prompts me for?
- Dean
From: Georg B. <gb...@mu...> - 2002-11-16 13:19:37
Hi!
Just to let you know: if you are interested in what I am hacking on and what I am playing around with, you can subscribe to my instant outline at http://hugo.muensterland.org/ - just click on the big coffee mug. I will put in at least todo items, stuff I am tinkering with and ideas I have about it. Nothing really important, but it might give you a bit of a "look into my brain".
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-16 12:34:21
Hi!
>> - fetch source from http://pycs.sourceforge.net/
>> - $PREFIX/pycs/bin/python setup.py build
>> - $PREFIX/pycs/bin/python setup.py install
> Do you mean http://pyxml.sourceforge.net/ ?
I just updated the wiki.
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-16 12:15:27
Hi!
> Do you mean http://pyxml.sourceforge.net/ ?
Yes. Damn. pysomethings all around me ;-)
bye, Georg
From: Phillip P. <ph...@my...> - 2002-11-16 11:58:27
> Ok, I installed PyXML.
>
> To install one should do:
>
> - fetch source from http://pycs.sourceforge.net/
> - $PREFIX/pycs/bin/python setup.py build
> - $PREFIX/pycs/bin/python setup.py install
Do you mean http://pyxml.sourceforge.net/ ?
Alternatively, here's a pointer straight to the source URL:
http://telia.dl.sourceforge.net/sourceforge/pyxml/PyXML-0.8.1.tar.gz
Cheers,
Phil
From: Georg B. <gb...@mu...> - 2002-11-16 10:57:02
Hi!
> neighbourhood tool), and the solution was to install PyXML, which
> comes with a much quicker XML parser than the standard xmlrpclib one.
Ok, I installed PyXML. To install, one should do:
- fetch source from http://pycs.sourceforge.net/
- $PREFIX/pycs/bin/python setup.py build
- $PREFIX/pycs/bin/python setup.py install
It looks like it solves the problem; I can't tell for sure because one of my servers is playing up against me and currently winning (two machines playing heartbeat ping-pong and _no_ trace in the logs makes one really nervous ...)
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-16 10:20:55
Hi!
> If it returns an instance of ExpatParser, you're OK. If it returns a
> SlowParser(), see what you can install to speed that up.
Aarghl. It returns SlowParser. Ok, we have to add Expat and PyXML installation to your installation document :-)
> See what sort of speedup you get after installing PyXML first though,
> as it might solve your problem ...
Yep, that's the first thing I will check today.
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-16 10:18:14
Hi!
> When installing on FreeBSD, Python 2.1.3 is known to
> have a problem causing stack overflows, stopping some
> pages from working.
Ok, we use Python 2.2, so that should be fixed already.
bye, Georg
From: Phillip P. <pp...@my...> - 2002-11-16 10:00:41
Hi,
Sorry - I just moved house, and haven't had time to check my mail until now.
I've had trouble with large messages sent to saveMultipleFiles in the past (first noticed it when Rogers Cadenhead tried to upstream the very large HTML output of Dave Winer's weblog neighbourhood tool), and the solution was to install PyXML, which comes with a much quicker XML parser than the standard xmlrpclib one. To check:
python
import xmlrpclib
xmlrpclib.getparser()
If it returns an instance of ExpatParser, you're OK. If it returns a SlowParser(), see what you can install to speed that up. On the old pycs.net server, Radio was timing out sending a 1 meg file with the slow parser, and worked fine when PyXML was installed.
The continued failure is a bit shocking. I've seen the python process using > 90% CPU for long periods of time in the aforementioned XML parsing timeout situation though: is this what is happening?
About threads: yeah, I don't think that will be the problem, as (as someone said) Medusa doesn't use them, and PyCS doesn't either. If you want to decrease the latency of saveMultipleFiles, you can do it without adding threads -- expat lets you pass it little chunks of XML at a time, and it will call you back when stuff happens. It might be possible to hack xmlrpclib to let you run lots of things at once.
See what sort of speedup you get after installing PyXML first though, as it might solve your problem ...
Cheers,
Phil :)
On Thu, Nov 14, 2002 at 07:48:46PM +0100, Georg Bauer wrote:
> I have a weird blocking/hanging of pycs after bigger upstreams. When I
> upstream my complete site, I get errors in the eventlog and the server
> isn't responding for some time. Restarting fixes it, but sometimes it
> seems to fix itself. Don't know what's up there, but it looks like some
> problem with timing and time-needed by XML/RPC-calls to me.
>
> Anybody has any clue as to what actually might happen inside pycs at
> that moment?
>
> This reminds me that I wanted to ask whether the server is running
> multithreaded or with serialized handling of requests. I actually never
> looked into the code to determine this :-)
>
> If it doesn't run multithreaded (and if it isn't, this might give a clue
> as to where the problem up there comes from: large saveMultipleFiles
> will block the server for some time and so it can't process further
> requests and that's the problem with accessing it), should we make it
> run that way? This will require semaphores for the database, right?
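A quick way to run the check above from an interactive session - a minimal sketch assuming the Python 2.x-era xmlrpclib that PyCS uses, where ExpatParser and SlowParser are the class names xmlrpclib itself defines:

import xmlrpclib

parser, unmarshaller = xmlrpclib.getparser()
print parser.__class__.__name__
# 'ExpatParser' -> the fast C expat parser is in use; large saveMultipleFiles
#                  payloads should parse quickly
# 'SlowParser'  -> the pure-Python fallback is in use; installing PyXML (or
#                  another expat binding) should speed things up considerably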
From: Dean G. <goo...@ya...> - 2002-11-16 06:33:11
> > There's been a community ripple effect of a FreeBSD
> > Python threads issue..
> Hmm. Would be interesting if you dig up something on that
Via ZWiki: When installing on FreeBSD, Python 2.1.3 is known to have a problem causing stack overflows, stopping some pages from working. To fix this, you need to apply some simple patches ... http://zwiki.org/PatchPython213 . Also see: http://zwiki.org/IssueNo0226
And [ 554841 ] THREAD_STACK_SIZE for 2.1:
http://sourceforge.net/tracker/?func=detail&aid=554841&group_id=5470&atid=305470
via http://radio.weblogs.com/0106123/categories/python/2002/07/17.html
...hope that's somewhat in line with what you were expecting.
- Dean
From: Georg B. <gb...@mu...> - 2002-11-15 18:04:34
Hi!
I am replying to the list, so others can join in.
> I'm quite impressed with Zope Page Templates. They may be used
> independent of Zope -
> http://lists.zope.org/pipermail/zpt/2002-July/003490.html
Yes, they are quite impressive; I played with them a bit under Zope (although our systems all still run with DTML, as that is what the programmers and customers know). But the problem with them is that they pull in quite a big bunch of tools, so installation might get problematic, and I really don't know much about standalone ZPT installation (actually I couldn't find any pointers on the web). And it might just be a _bit_ too large for our purpose of rendering some system pages. We are not talking about big systems here, only the comments page, the rankings, weblogUpdates and referrers pages :-)
I was actually looking for something simpler, like PSP but without the web server - just something where you can set up simple pages easily and that doesn't put too much stress on the server.
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-15 14:57:36
Hi!
Another question: since I run a German community server, I need to translate everything into German (not everybody around here is on good terms with English). And, of course, I want to put my own design on the server (even though I currently don't actually have a design of my own ;-) ). What would be the best-practice way to accomplish this?
For the translation, I think switching to some localization library (GNU gettext? other possibilities?) would be the right way, and for the changed design we should implement templates. Any good pointers to localization libraries and template libraries for Python that can be used without too much hassle? I had a look for template engines and found some stuff, but most of it comes with its own web server or CGI implementation, and I don't think that's the right way - we already have our own module and server solution.
Any hints/pointers/links/ideas/comments?
bye, Georg
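For the translation side, a minimal sketch of the GNU gettext approach mentioned above - the 'pycs' domain name, the locale directory and the German catalogue are purely illustrative assumptions, not something the server actually ships:

import gettext

# Assumes a compiled catalogue exists at locale/de/LC_MESSAGES/pycs.mo;
# gettext.translation() raises IOError if it cannot find one.
t = gettext.translation('pycs', 'locale', languages=['de'])
_ = t.gettext

# Page-rendering code then wraps its user-visible strings once:
print _('Recent referrers')   # prints the German text from the catalogue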
From: Georg B. <gb...@mu...> - 2002-11-15 14:47:16
Hi!
Two small changes made it into CVS:
- ctBytesUpstreamed is now filled while upstreaming with saveMultipleFiles. Nothing important, just another 1234 gone ;-)
- you can now debug a running pycs instance with the medusa monitor_client.
To use the monitor_client with your installation of pycs (might be an old and well-known thing for some of us, but for those that are new to server debugging with medusa, here are some little examples; $PREFIX denotes the prefix you used in your installation process):
- upgrade to the newest CVS version
- check that your installation of medusa is complete and includes the monitor_client.py script in the $PREFIX/usr/lib/pycs/bin/medusa directory
- add two settings to your pycs.conf:
  - monitorport = 8448
  - monitorpassword = someSensiblePasswordNotBreakable
- restart pycs
Now you can connect to the debugger with:
$PREFIX/bin/python $PREFIX/usr/lib/pycs/bin/medusa/monitor_client.py localhost 8448
After that you get a simple python shell (no readline support, only plain tty). The first command should be:
from __main__ import *
Now you have all your global variables at your disposal, and you can do anything you would do if your server ran in your interactive python environment. For example (>>> denotes your entry, the rest is the result, just like in the interactive interpreter):
>>> dir()
['__builtins__', 'accessLog', 'alias', 'asyncore', 'bl_h', 'daemonize', 'default_h', 'default_handler', 'filesys', 'fs', 'gid', 'hs', 'http_server', 'install_handlers', 'logger', 'mod_h', 'monitor', 'ms', 'my_pid', 'os', 'pid_file', 'port', 'pwd', 'pycsAdmin', 'pycs_auth_handler', 'pycs_block_handler', 'pycs_module_handler', 'pycs_paths', 'pycs_rewrite_handler', 'pycs_settings', 'pycs_xmlrpc_handler', 'radioCommunityServer', 're', 'rewriteFn', 'rewriteMap', 'rpc_h', 'rpc_padm_h', 'rpc_rcs_h', 'rpc_wu_h', 'rpc_xss_h', 'rw_h', 'scriptDir', 'set', 'status_handler', 'sys', 'terminate', 'time', 'uid', 'usernum', 'weblogUpdates', 'x', 'xmlFunc', 'xmlFuncRet', 'xmlStorageSystem']
>>> dir(hs)
['SERVER_IDENT', '__doc__', '__getattr__', '__init__', '__module__', '__repr__', '_fileno', 'accept', 'accepting', 'add_channel', 'addr', 'bind', 'bytes_in', 'bytes_out', 'channel_class', 'close', 'closing', 'connect', 'connected', 'create_socket', 'debug', 'del_channel', 'exceptions', 'family_and_type', 'handle_accept', 'handle_close', 'handle_connect', 'handle_error', 'handle_expt', 'handle_expt_event', 'handle_read', 'handle_read_event', 'handle_write', 'handle_write_event', 'handlers', 'install_handler', 'ip', 'listen', 'log', 'log_info', 'logger', 'port', 'readable', 'recv', 'remove_handler', 'send', 'server_name', 'server_port', 'set_reuse_addr', 'set_socket', 'socket', 'status', 'total_clients', 'total_requests', 'writable']
>>> hs.handlers
[<pycs_rewrite_handler.pycs_rewrite_handler instance at 0x82708ec>, <pycs_block_handler.pycs_block_handler instance at 0x827089c>, <pycs_xmlrpc_handler.pycs_xmlrpc_handler instance at 0x8270374>, <pycs_module_handler.pycs_module_handler instance at 0x826e15c>, <Default HTTP Request Handler (66 hits) at 825c734>]
"hs" is the http_server variable; everything is connected to this one. Another interesting thing is "set", the settings structure:
>>> dir(set)
['AddReferrer', 'AddUpdate', 'Commit', 'DefaultConfigValue', 'DumpData', 'DumpMeta', 'DumpUsers', 'FindUser', 'FindUserByEmail', 'FormatUsernum', 'GetDate', 'GetTime', 'LongTitle', 'NewUser', 'Render', 'ServerHostname', 'ServerMailTo', 'ServerPort', 'ServerUrl', 'ShortTitle', 'User', 'UserFolder', '__doc__', '__init__', '__module__', 'aliases', 'conf', 'db', 'meta', 'readAllOptions', 'referrers', 'updates', 'users']
Or set.users, the user database object:
>>> dir(set.users)
['access', 'addproperty', 'append', 'blocked', 'copy', 'counts', 'delete', 'different', 'filter', 'find', 'flatten', 'groupby', 'hash', 'indexed', 'indices', 'insert', 'intersect', 'itemsize', 'join', 'locate', 'map', 'minus', 'modify', 'ordered', 'pair', 'product', 'project', 'reduce', 'remapwith', 'remove', 'rename', 'search', 'select', 'setsize', 'sort', 'sortrev', 'structure', 'union', 'unique']
So dir gives you the contents of an object, and you can access every running object just as you would if it ran interactively. You can even evaluate objects and run small snippets of python code:
>>> for row in set.users:
...     print row.usernum
...
0000001
0000002
0000003
That's server debugging for me ;-)
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-15 14:12:28
Hi!
Just some more input on the weird hanging problem after big saveMultipleFiles:
qual name: ['xmlStorageSystem', 'saveMultipleFiles']
trying to find xmlStorageSystem
XSS method ['saveMultipleFiles'] email 0000001 password XXXXXXXXXXX
-> name categories/fotografie/index.html
safe name: categories/fotografie/index.html
saving text file
-> name 2002/11/05.html
safe name: 2002/11/05.html
[...]
resulting urls: ['http://hugo.muensterland.org/categories/fotografie/index.html', 'http://hugo.muensterland.org/2002/11/05.html', 'http://hugo.muensterland.org/2002/11/06.html', 'http://hugo.muensterland.org/2002/11/07.html', [...]
This is the saveMultipleFiles - just a "publish all pages" from Radio. Afterwards I get the following when I try to connect to the system via curl http://localhost:5445/:
error: uncaptured python exception, closing channel <http_server.http_channel connected 127.0.0.1:4749 at 0x82b1a5c channel#: 876 requests:1> (socket.error:(32, 'Broken pipe') [/home/www-pycs/pycs/lib/python2.2/asynchat.py|handle_read|82] [/home/www-pycs/pycs/usr/lib/pycs/bin/medusa/http_server.py|recv|398] [/home/www-pycs/pycs/lib/python2.2/asyncore.py|recv|362])
This continues for a while. It looks to me as if the actual XML-RPC call is fulfilled, only connections afterwards are fubar. It doesn't look like something that can be cured by using multiple threads to push work into the background; it looks more like something stomps over something else and breaks it. I get it out of nowhere, too: in one instance the server was polled every minute for several hours without problems, but then suddenly locked up and didn't want to talk again. I had set up a monitoring tool on my server to closely watch the service, and there were absolutely no upstreams at the time. Weird.
The messages seem to come from medusa's asyncore module. Somehow the channel gets closed or is broken. Any ideas on this?
bye, Georg
From: Georg B. <gb...@mu...> - 2002-11-15 12:52:05
Hi!
I am replying to the pycs-devel list too; you just left it out, but I think that was by accident ;-)
> Medusa is not multi-threaded...but that's not to say
> the other tasks aren't.
I actually didn't see any instance of threading in there, so I think that pycs as a whole isn't multithreaded if medusa itself isn't. There are some hooks that would give an opportunity to create threads and let stuff process in the background while still keeping a non-threaded medusa:
- for example, the transfer from handle_request to continue_request would be a natural point where you create a thread for the latter and so free up the listener
- another way would be in the XML-RPC handler, where you process all incoming data, parse it, find the matching function and call _that_ in a thread; the result would then be returned by the thread and not by the direct handler
Both methods have the problem, though, that big requests still block the system. So usually one sets up threading in servers directly at the listener level: as soon as a connection is made, the work is handed off to a thread and the listener is reopened. This would call for making medusa multithreaded. Of course, one would have to watch one's steps to not break too much in the process. I usually don't build on medusa myself but use the simple http server that comes with the Python distribution, so I don't know whether there are better places in medusa to set up threading. The "builtin" http server is quite easily changed to multithreaded or multiprocessed - just some subclassing and overriding of the bind method.
Another option would be pre-forked processes (that's what Zope does with medusa), but I actually don't like that approach too much. Dynamic forking, or better dynamic threading, is much more to my liking (even though you will have to add some stuff to prevent too many threads). A combined option built on both dynamic threads and static preforking would be the approach taken by Apache 1.x: preforked processes, a scoreboard for communication, and dynamic starting/stopping of processes with central dispatch. But that's a _lot_ of overhead and code to write ...
> There's been a community ripple effect of a FreeBSD
> Python threads issue..
Hmm. Would be interesting if you dig up something on that, as it might direct us to use one model or the other, since I think keeping broad compatibility of PyCS with OSes would be a good thing [tm]. And since I only have Linux and Mac OS X as options to check with (and currently actually only use Linux for the server), I am not very current with regards to BSD systems.
> I can dig for more info/links if you think it'll help.
Sure. Everything might help; if nothing else it keeps the list running ;-)
bye, Georg
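A rough sketch of the second hand-off point described above - purely illustrative, using Python's standard threading module; continue_request, find_function and send_reply are placeholder names, not the real PyCS/medusa API:

import threading

def continue_request(self, data, request):
    # Parse the XML-RPC body and find the target function (placeholder call).
    func, params = self.find_function(data)
    def work():
        # The potentially slow call runs off the asyncore loop ...
        result = func(*params)
        # ... and the worker thread hands the response back when it is done.
        self.send_reply(request, result)
    threading.Thread(target=work).start()
    # The listener returns immediately and can accept other connections.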
From: Georg B. <gb...@mu...> - 2002-11-14 18:49:02
Hi!
I have a weird blocking/hanging of pycs after bigger upstreams. When I upstream my complete site, I get errors in the eventlog and the server isn't responding for some time. Restarting fixes it, but sometimes it seems to fix itself. Don't know what's up there, but it looks like some problem with timing and time-needed by XML/RPC-calls to me.
Anybody has any clue as to what actually might happen inside pycs at that moment?
This reminds me that I wanted to ask whether the server is running multithreaded or with serialized handling of requests. I actually never looked into the code to determine this :-)
If it doesn't run multithreaded (and if it isn't, this might give a clue as to where the problem up there comes from: large saveMultipleFiles will block the server for some time and so it can't process further requests and that's the problem with accessing it), should we make it run that way? This will require semaphores for the database, right?
bye, Georg
From: Phillip P. <pp...@my...> - 2002-11-14 01:39:12
Hi guys,
It looks like my recent server move broke the 'myelin.pycs.net' alias, so nobody's coffee cups and xml icons on comment pages have been working since then. Sorry about that!
I've moved them into CVS now, so every PyCS install will have its own copy of mailto.gif, tinyCoffeeCup.gif and xml.gif. This should make some pages load rather quicker ;-)
To get the new files, you'll need to tell CVS to pick up new directories as well -- use 'cvs update -d' rather than 'cvs update'.
Cheers,
Phil
From: Phillip P. <pp...@my...> - 2002-11-14 00:11:39
> >Also: I see that some files now expect to see pycs.css in the server
> >root; to get this to go, I've added pycs.css into CVS, and changed the
> >Makefile to install it into the www dir.
>
> Ah, yes, that's better for new installations. I definitely will have to
> set up another community just so that I can throw it away and reinstall
> from scratch from time to time. Currently I am developing and debugging
> directly with muensterland.org :-)
>
> Ok, I don't have any users to speak of, yet, so it's not that big a
> problem.
Heh ;-) I used to do that on pycs.net, when it was hosted on a PC in my lounge, but now that it's moved I have to run everything through the CVS server before it'll go on there, so I use the old one to test stuff.
I've got a 'master PyCS' directory, and a 'gbauer' one that I was using to test patches I got from you. Each one had a shell script that would blow away a temp directory, then call the installer, then start the newly installed server and tail all the log files. That seemed to work fine. Something like this:
INST=../inst
if [ -f $INST/var/run/pycs/pycs.pid ]; then
    kill `cat $INST/var/run/pycs/pycs.pid`
fi
rm -rf $INST
make install USER=$USER ROOT=$USER PREFIX=$INST
$INST/usr/lib/pycs/bin/pycs.py
tail -f $INST/var/log/pycs/*.log
The trick then is to test the one in CVS after you commit some changes -- go into another directory, call 'cvs update', then run the installer and see if it works.
Anyone feel like writing some unit tests? It shouldn't be too hard to test all the XML-RPC stuff (create a blog, upload some files, check that they can be retrieved through the URLs returned, delete them, check that they have actually been deleted, do some admin stuff, and so on) and then go through and request all of the system web pages and verify that they are actually returning something and that no server errors occur. Something to verify all that would be very useful ...
Cheers,
Phil
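A skeleton for the smoke-test idea above - a hedged sketch only: the port, the page list and the /RPC2 endpoint path are assumptions for illustration, and the xmlStorageSystem calls (create a blog, upload files, delete them) would still need to be filled in with the real method signatures:

import unittest
import urllib
import xmlrpclib

BASE = 'http://localhost:5445'   # assumed address of a freshly installed test server

class PycsSmokeTest(unittest.TestCase):

    def test_system_pages_render(self):
        # Request a few system pages and check that something non-empty comes back
        # and that no server traceback leaks into the output.
        for page in ('/',):          # placeholder page list - extend with the real system pages
            body = urllib.urlopen(BASE + page).read()
            self.failUnless(len(body) > 0, '%s returned an empty page' % page)
            self.failIf('Traceback (most recent call last)' in body,
                        '%s shows a server error' % page)

    def test_xmlrpc_endpoint_returns_faults(self):
        # A call to a nonexistent method should come back as a clean XML-RPC fault,
        # not as an unparseable response.
        server = xmlrpclib.ServerProxy(BASE + '/RPC2')
        self.assertRaises(xmlrpclib.Fault, server.noSuchMethod)

if __name__ == '__main__':
    unittest.main()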