From: Ian B. <ia...@co...> - 2004-02-05 16:39:47
Peter Gebauer wrote:
> Hello! If this message has been sent twice, it's because I have had problems
> with my ISP's SMTP lately.
>
> I'm a big fan of SQLObject, very nicely done!
> I have two things, one question and one patch.
>
> I haven't found a way to do row or table locking for a transaction.
> Basically, I'd like to do something like
>
> conn = DBConnection.PostgresConnection('yada')
> trans = conn.transaction()
> p = Person(1, trans)
> p.lock() # table locking
> p.selectForUpdate("yada") # row locking
> ... do something that only one client may do at a time ...
> trans.commit()
>
> There are many ways to do it, but since some databases cannot lock rows and
> different databases support different locking modes, I really don't know how
> to make it super general.
>
> The selectForUpdate() should work the same way as any select, except that if
> the database supports row locking it will use a SELECT ... FOR UPDATE statement.
>
> For table locking it's a bit tricky, since there are so many different modes
> that vary by database implementation.
>
> Any suggestions?
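
For reference, the row-locking part of this is just a SELECT ... FOR UPDATE
inside a transaction. A minimal DB-API sketch, using psycopg2 purely for
concreteness (the person table and its columns are made up, and none of this
is SQLObject code):

import psycopg2

conn = psycopg2.connect(dbname='yada')
cur = conn.cursor()
try:
    # FOR UPDATE locks the matched row until the transaction ends.
    cur.execute("SELECT id, name FROM person WHERE id = %s FOR UPDATE", (1,))
    row = cur.fetchone()
    # ... do something that only one client may do at a time ...
    cur.execute("UPDATE person SET name = %s WHERE id = %s", ('New name', 1))
    conn.commit()    # committing releases the row lock
except Exception:
    conn.rollback()  # rolling back releases it too
    raise
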
I think there's the most general interest in optimistic locking, i.e.,
rows have a timestamp, and if the timestamp has been updated since the
object was fetched/synced, you get some sort of conflict exception when
you try to commit changes. This is implemented almost entirely outside
of the database, so cross-database compatibility should be easy. The
rest may not be exactly easy, though.
Anyway, it seems a lot better than table locking, and it's a bit better
than row locking, but it catches the conflict later. It's harder to do
right (on the application level) without transactions. In fact, without
transactions I think you can't do it, because you might send one update,
and the second update (which is required for consistency) could fail.
Oh well.
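
A rough sketch of the idea, assuming each row carries a last_modified
timestamp column; the table, columns, helper function, and ConflictError
exception are all made up here, not existing SQLObject machinery:

import time

class ConflictError(Exception):
    """Raised when the row changed under us since it was fetched."""

def update_person_name(cur, person_id, old_last_modified, new_name):
    # Optimistic locking: the UPDATE only matches if last_modified still
    # holds the value we read when the object was fetched/synced.
    now = time.time()
    cur.execute(
        "UPDATE person SET name = %s, last_modified = %s"
        " WHERE id = %s AND last_modified = %s",
        (new_name, now, person_id, old_last_modified))
    if cur.rowcount == 0:
        # Somebody else committed a newer timestamp in the meantime.
        raise ConflictError("person %s was modified concurrently" % person_id)

On a conflict the application would typically re-fetch the row and either
retry or report the error to the user.
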
> The second thing is that I have made a very simple patch that allows
> database connections to use a logger if specified.
>
> import logging
> logger = logging.getLogger('test')
> hdlr = logging.FileHandler('test.log')
> formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
> hdlr.setFormatter(formatter)
> logger.addHandler(hdlr)
> logger.setLevel(logging.DEBUG)
>
> conn.debug = 1
> conn.logger = logger
>
> The above code will make use of Python 2.3's logging facilities.
> I'd like to split SQLObject logging into finer-grained levels. For example,
> SQL statements may be considered DEBUG level, while inability to connect to
> a database may be considered CRITICAL.
Yeah, that'd be cool. Is there a logging module backported to 2.2?
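
A minimal sketch of what that split might look like, assuming the connection
just carries a logger attribute as in your patch; the _log helper and the
_executeSQL/_connect/makeConnection names are placeholders, not SQLObject's
real internals:

import logging

class LoggingConnectionMixin:
    # Mirrors the patch: a connection optionally carries a logger.
    logger = None
    debug = 0

    def _log(self, level, message):
        if self.logger is not None:
            self.logger.log(level, message)
        elif self.debug:
            print message

    def _executeSQL(self, cursor, sql):
        # Routine SQL is chatty, so it goes to DEBUG.
        self._log(logging.DEBUG, sql)
        cursor.execute(sql)

    def _connect(self):
        try:
            return self.makeConnection()
        except Exception, e:
            # Failing to reach the database at all is CRITICAL.
            self._log(logging.CRITICAL, "could not connect: %s" % e)
            raise
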
Ian