From: Colin S. <co...@ow...> - 2004-10-04 23:02:16
Hi,
I've just started using SQLObject in a program that loads web server log
information into a MySQL database. The program consists of a tight loop
that reads a web server entry, parses the data, performs one SQL lookup,
and then inserts the data into a table.
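(For context, the loop shape is roughly the following sketch, using the stdlib sqlite3 module in place of MySQLdb; the table and column names here are made up for illustration, not the real schema.)

```python
import sqlite3

# In-memory SQLite stands in for MySQL; the per-record lookup-then-insert
# pattern is the same with MySQLdb.  Schema and names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE host (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
cur.execute("CREATE TABLE hit (host_id INTEGER, path TEXT)")

log_entries = [("example.com", "/index.html"),
               ("example.com", "/about"),
               ("other.net", "/index.html")]

for host, path in log_entries:
    # one SQL lookup per record
    cur.execute("SELECT id FROM host WHERE name = ?", (host,))
    row = cur.fetchone()
    if row is None:
        cur.execute("INSERT INTO host (name) VALUES (?)", (host,))
        host_id = cur.lastrowid
    else:
        host_id = row[0]
    # one insert per record
    cur.execute("INSERT INTO hit (host_id, path) VALUES (?, ?)", (host_id, path))
conn.commit()

cur.execute("SELECT COUNT(*) FROM hit")
print(cur.fetchone()[0])  # 3
```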
When I use the MySQL libraries directly, the program handles ~290 rec/s.
When I use SQLObject to perform the lookup and the insert, the rate drops
to ~60 rec/s.
Is it unrealistic to use SQLObject for DB interaction when handling
batch loads of data? I've done a quick profile of the code (top few
calls below), and nothing jumps out as being particularly easy to
optimise...
ncalls tottime percall cumtime percall filename:lineno(function)
15308/14042 2.500 0.000 4.430 0.000 converters.py:179(sqlrepr)
1210 2.310 0.002 3.830 0.003 main.py:755(set)
3630 2.140 0.001 3.700 0.001 cursors.py:166(__do_query)
14042 1.370 0.000 5.800 0.000 dbconnection.py:438(sqlrepr)
23244 1.240 0.000 1.240 0.000 main.py:1176(instanceName)
3419 0.830 0.000 15.820 0.005 dbconnection.py:114(_runWithConnection)
14253 0.810 0.000 0.810 0.000 converters.py:81(lookupConverter)
1210 0.800 0.001 1.510 0.001 main.py:812(_SO_selectInit)
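(A table like the above can be reproduced with Python's stdlib profiler; this sketch uses the modern cProfile module, and load_logs is just a stand-in workload, not the real loading loop.)

```python
import cProfile
import io
import pstats

# Stand-in workload; in the real program this would be the log-loading loop.
def load_logs():
    total = 0
    for i in range(1000):
        total += i
    return total

profiler = cProfile.Profile()
profiler.enable()
load_logs()
profiler.disable()

# Sort by cumulative time and print the top 8 entries, as in the table above.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(8)
report = buf.getvalue()
print(report)
```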
Colin.