[SQLObject] Best approach for sharing connections
SQLObject is a Python ORM.
From: Jorge G. <go...@ie...> - 2005-10-06 19:09:22
Hi,

I'm writing an application where I have several different modules and a centralized login when the user connects to the GUI. Now we're implementing a Web interface and I'd like to migrate from Qt's SQL interface to SQLObject, but retain this facility. This application is used by several users simultaneously -- and the user base will be four to five times bigger after we grant web access -- with lots of restrictions and logic implemented on the database side (PostgreSQL, using rules, functions, triggers, etc.). Each user that connects to the system (through any interface) needs to open a new connection and do his/her work.

What I'm doing now is creating one module for the connection, with the following contents:

--------------------------------------------------------------------------------
import sqlobject

#### Omitted for email
# Get variables here for username, hostname and password
####

conn_str = 'postgres://%s:%s@%s/neo' % (usuario, senha, host)
connection = sqlobject.connectionForURI(conn_str)
--------------------------------------------------------------------------------

I call this for each user. Then I create all the mappings in another Python file and use that as the source for my UI data.

Some doubts I have on how to get the best out of SQLObject:

- Is this a good approach?
- Is it better to create a new directory and separate the classes into several files? There are some particularities that could be mapped this way, but usually one class / subsystem depends heavily on another...
- Is it safe to establish only one new connection per user? Is it possible to isolate simultaneous, separate actions in concurrent transactions (e.g. an address being inserted into the database while a new client / employee is being inserted as well)?
- What is the best way to handle cursors?
  I'll have some tables that will reach millions of records very soon, and it's impossible to load them all into a dictionary / list to move from one record to the next... (If I couldn't use cursors, I was thinking about creating a pseudo linked list, where each record also points to the prior and next records. How about that?)
- I can change my model to use lots of artificial keys to comply with SQLObject's restriction of numerical PKs... Is that still necessary? How about composite keys (two, three, four fields)? We're mapping some bio data; there are lots of different conditions and analyses that can / need to be done on each sample, and sometimes in a table with 6 or 7 required fields only one field -- and not always the same one -- differs between records, so I need to index them all together as "unique" to avoid duplicates...
- How is exception handling? And error recovery?

I'm also willing to dedicate some time to documenting these topics better, since I haven't found much in the existing docs (I found some material spread around the net -- tutorials, presentations, etc. -- but I'd like to put it together for SQLObject).

Be seeing you,
--
Jorge Godoy <go...@ie...>
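[Editor's note: several of the questions above can be illustrated with one short, runnable sketch. This is one possible way to wire things up, not an authoritative answer: it substitutes an in-memory SQLite URI for the `postgres://` URI from the module above so it runs anywhere, and the `connection_for_user` helper plus the `Sample` table and its columns are purely illustrative names. It shows one connection per user, an explicit transaction isolating one user's work until commit, a multi-column unique index standing in for a composite key (SQLObject keeps its integer `id` primary key), and lazy slicing of a `select()` result, which translates to LIMIT/OFFSET on the server instead of loading the whole table.]

--------------------------------------------------------------------------------
import sqlobject

def connection_for_user(usuario, senha, host):
    # In production this would be:
    #   'postgres://%s:%s@%s/neo' % (usuario, senha, host)
    # An in-memory SQLite URI keeps this sketch self-contained.
    return sqlobject.connectionForURI('sqlite:/:memory:')

conn = connection_for_user('alice', 'secret', 'localhost')

class Sample(sqlobject.SQLObject):
    runCode = sqlobject.StringCol()
    value = sqlobject.IntCol()
    # SQLObject keeps its own integer 'id' primary key; a
    # multi-column unique index enforces the natural composite
    # key (runCode, value) to avoid duplicates.
    pairIdx = sqlobject.DatabaseIndex('runCode', 'value', unique=True)

Sample.createTable(connection=conn)

# Each separate action can run in its own transaction, isolated
# from work on other users' connections until committed.
trans = conn.transaction()
for i in range(10):
    Sample(runCode='A', value=i, connection=trans)
trans.commit()

# select() is lazy: slicing the result becomes LIMIT/OFFSET on the
# server, so a table with millions of rows is never loaded wholesale.
page = Sample.select(orderBy=Sample.q.value, connection=conn)[3:6]
print([s.value for s in page])  # a three-row window, not the whole table
--------------------------------------------------------------------------------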