I'm trying to use SQLObject's lazyUpdate feature and my data is getting clobbered. Here's the class definition:

    class T(SQLObject):
        lazyUpdate = True

        foo = IntCol()
        bar = IntCol(default=None)

        def __enter__(self):
            return self

        def __exit__(self, type, value, traceback):
            self.syncUpdate()
The context manager functions are so I can write something like:
    with T.select().getOne() as t:
        # update t
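To make the intent concrete, here is a minimal, SQLObject-free sketch of that pattern: __enter__ hands back the object, and __exit__ flushes buffered writes on exit, roughly what syncUpdate() does for a lazy SQLObject row. LazyRecord, set, and flush are made-up names for illustration only:

```python
class LazyRecord(object):
    """Buffers attribute writes until flush() -- a stand-in for a
    lazyUpdate SQLObject row, where flush() plays the role of
    syncUpdate()."""

    def __init__(self):
        self._pending = {}   # writes buffered, not yet persisted
        self._stored = {}    # what has actually been "written out"

    def set(self, name, value):
        self._pending[name] = value

    def flush(self):
        # Persist all buffered writes in one go, then clear the buffer.
        self._stored.update(self._pending)
        self._pending.clear()

    def __enter__(self):
        return self

    def __exit__(self, type, value, traceback):
        # Flushing here means callers never have to remember to do it.
        self.flush()

r = LazyRecord()
with r as t:
    t.set('foo', 42)   # buffered inside the block, stored on exit
```

The point of putting the flush in __exit__ is that every `with` block gets exactly one batched write, instead of one write per attribute assignment.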
Everything seems to work fine when I'm creating the objects. However, when I'm just reading the objects, fields with default values (like bar above) are getting reset to their default values.
For example, given

    def get_rows():
        rows = T.select()
        return list(repr(r) for r in rows)
If I call get_rows repeatedly, I find that the first time I get the correct data and subsequent times some or all of the rows have their values replaced by their defaults. Eventually all the rows are reset to defaults. Fields without default values are not touched.
If I comment out the line that sets lazyUpdate in the class definition, it works fine (presumably with many more writes to the db). If I comment out the call to syncUpdate, it still fails. Changing the default to a different value (like -1) still fails, and the values get reset to -1 rather than None. It's not some lingering schema setting, either: I've deleted and recreated the table multiple times while trying to resolve the problem.
FWIW, I'm using SQLObject 1.3.1, Python 2.7 and PostgreSQL 9.1.