Jose Galvez wrote on 01/18/2006 12:16 PM:
>Ben Parker wrote:
>>This is not a strict Webware question, more of a Python and Linux
>>question, but I'm hoping someone may have some good insight into a fix.
>>I have a web app that relies on data from a 3rd party, fetched via XML
>>over HTTPS. I'm using urllib2 to connect to the remote server.
>Are you keeping the old data between connections, or do you fetch the
>data fresh every time? More importantly, how often are you actually
>connecting to this 3rd party server to get data? I have a similar
>application, but I only refresh my cache of the data once a day.
I'm doing lots of caching wherever possible, but most of the data must
be fetched in real-time. It is for hotel room availability, which
changes too quickly to cache for long.
>>Unfortunately, the 3rd party is unstable, and when they have a problem,
>>sometimes the appserver crashes with this error:
>>Traceback (most recent call last):
>> File "WebKit/ThreadedAppServer.py", line 630, in run
>> File "WebKit/ThreadedAppServer.py", line 156, in mainloop
>> File "/usr/local/lib/python2.4/socket.py", line 169, in accept
>>error: (24, 'Too many open files')
>If you're keeping the old data, then you can simply wrap your 3rd party
>call in a try/except statement and use the old data if the update fails.
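For what it's worth, the fallback pattern might look roughly like this. It is only a sketch: fetch_with_fallback and the fetch callable are made-up names here (the real request would be urllib2.urlopen plus a read), passed in as a parameter so the fallback logic itself is isolated:

```python
import socket

_cache = {}  # last known-good response, keyed by URL

def fetch_with_fallback(url, fetch, cache=_cache):
    """Try a live fetch; on a socket-level failure, serve the cached copy.

    `fetch` is whatever actually performs the request -- a stand-in for
    urllib2.urlopen(url).read() in the real app.
    """
    try:
        data = fetch(url)
        cache[url] = data          # remember the last good copy
        return data
    except (socket.error, socket.timeout, IOError):
        if url in cache:
            return cache[url]      # serve stale data rather than crash
        raise                      # nothing cached: let the caller decide
```

The point is that the except clause only helps if the failing call actually raises, which circles back to the timeout problem below.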
I have that already. I think the problem is that the sockets are not
timing out or raising any error, so the except block never triggers.
Otherwise, I'm already handling a few different kinds of socket errors.
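One relevant detail: in Python 2.4, urllib2.urlopen has no timeout argument (that arrived in 2.6), so a hung remote server can hold a socket open indefinitely and the descriptors pile up until accept() fails with "Too many open files". Setting a process-wide default timeout via socket.setdefaulttimeout, and always closing the response in a finally block, should turn a hang into a catchable socket.timeout instead of a leaked descriptor. A rough sketch, where the fetch wrapper and its opener parameter are hypothetical names (opener stands in for urllib2.urlopen):

```python
import socket

# Hung connections now raise socket.timeout instead of blocking forever.
socket.setdefaulttimeout(15)  # seconds; tune to what the app can tolerate

def fetch(url, opener):
    """opener is whatever makes the request (e.g. urllib2.urlopen);
    it is a parameter here so the close-on-every-path logic is testable."""
    conn = opener(url)
    try:
        return conn.read()
    finally:
        conn.close()  # release the file descriptor even if read() raises
```

Note that setdefaulttimeout is global to the process, so it affects every socket the appserver opens, not just the 3rd-party calls.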
I've been adding except clauses whenever I discover a new type of error:
These are trapped around urllib2.urlopen calls:
AttributeError (thrown from somewhere inside urllib2.urlopen when it gets a
bad response)
These are trapped while reading the file-like object returned by
urllib2.urlopen:
AssertionError (again thrown from somewhere inside urllib2)
Regards - Ben