Re: [Modeling-users] Modeling performance for large number of objects?
From: Wolfgang K. <wol...@gm...> - 2004-12-20 13:28:31
Hello, and thanks for your reply.

> Usually, the penalty does not interfere with the job at hand.

Oops, why? Obviously, when all objects get read into memory at startup
of the application server, and written back only at night, then...

> One reference point you might find useful is that when loading 3000
> objects from a database, modifying them, and then saving the changes,
> on a 700MHz p3 notebook, the loading took about 40 seconds, and the
> saving, 200. That's 20 times what a direct sql script would've taken.

This gives me an idea, thanks. A multiplier of 20 is quite
significant, IMHO.

> Of course, in both cases, writing the sql script would've taken a
> *lot* longer than the difference in run time, for me. However, it's
> obvious that there are cases where the runtime difference overpowers
> the developer time difference...

I was wondering whether Modeling would be suitable as a persistence
layer for a Python application that needs to process (create -
transform - store) rather large amounts of data.

The question for me is whether Modeling already tries to cut the
hourglass-display time down to the unavoidable minimum (which depends
on the database), and/or whether there would be some other way to do
so, e.g. by some kind of smart caching of objects or by maintaining a
pool of pre-created object instances.

Best regards,

Wolfgang Keller
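P.S. To make the quoted numbers concrete, below is roughly the
load-modify-save pattern I understand the benchmark to describe. This
is only a minimal sketch: I am assuming the EditingContext API as
shown in the Modeling documentation, and the entity 'Book' with its
getTitle()/setTitle() accessors is invented for illustration.

    from Modeling.EditingContext import EditingContext

    ec = EditingContext()

    # Fetch all objects of one entity into memory
    # (the ~40s step for 3000 objects in the benchmark)
    books = ec.fetch('Book')

    # Modify the objects in memory; the EditingContext tracks changes
    for book in books:
        book.setTitle(book.getTitle().strip())

    # Write every registered change back to the database
    # (the ~200s step in the benchmark)
    ec.saveChanges()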
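And this is the sort of "pool of pre-created object instances" I have
in mind; nothing Modeling-specific, just a generic Python sketch with
made-up names, meant to illustrate the question rather than to propose
an implementation:

    class InstancePool:
        """Keep a stock of pre-created blank objects, so that the cost
        of instantiation is paid up front instead of in the hot loop."""

        def __init__(self, factory, size):
            self._factory = factory
            self._free = [factory() for i in range(size)]

        def acquire(self):
            # Hand out a pre-created instance, or build one if exhausted
            if self._free:
                return self._free.pop()
            return self._factory()

        def release(self, obj):
            # Return an instance to the pool for later reuse
            self._free.append(obj)

    # Usage: pre-create 3000 instances once, then recycle them
    pool = InstancePool(dict, 3000)
    record = pool.acquire()
    record['status'] = 'processed'
    pool.release(record)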