Memory leak with SSCursor (even when use_unicode=False)
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(..., use_unicode=False,
                       cursorclass=MySQLdb.cursors.SSCursor)
cur = conn.cursor()
cur.execute('select * from reallybigtable;')
print cur.fetchone()
cur.close()
conn.close()
The row is printed immediately, but instead of the program exiting, it hangs, and Python's memory usage climbs steadily (it quickly grows past 1 GB). A keyboard interrupt does nothing; I have to kill the process.
As far as I know, this is a separate issue from the memory leaks described in [#319] and [#265].
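In case it helps anyone else hitting this, here is a sketch of a possible workaround (the connection parameters below are placeholders): drain the rest of the result in fixed-size batches with fetchmany() before calling close(), so the cursor has nothing left to pull down in one shot during cleanup. In principle this keeps memory bounded, although it still streams every remaining row from the server.

import MySQLdb
import MySQLdb.cursors

# Placeholder connection parameters -- substitute your own.
conn = MySQLdb.connect(host='localhost', user='me', passwd='secret', db='test',
                       use_unicode=False, cursorclass=MySQLdb.cursors.SSCursor)
cur = conn.cursor()
cur.execute('select * from reallybigtable;')
print cur.fetchone()

# Drain the remaining rows in fixed-size batches and discard them, so that
# close() has nothing left to fetch in a single call.
while True:
    if not cur.fetchmany(1000):
        break

cur.close()
conn.close()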
It seems I can't edit my ticket? Weird. Anyways, I should have mentioned, I'm using version 1.2.3c1.
It seems that when the cursor is closed, it quickly runs through all of the remaining result sets (see BaseCursor.nextset()), and if the statement was a plain execute call, it calls self.fetchall()...
That fetchall() goes straight to the underlying layer with no batch size specified. My educated guess is that this is what blows up memory, since the native library tries to return everything in one shot.
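To make that concrete, here is a toy model of the path being described; this is a simplified stand-in, not MySQLdb's actual code. fetchone() is cheap, but close() ends up behind a size-less fetchall() that materializes everything still waiting on the server as one big list.

# Toy model of the described cleanup path (not MySQLdb's real code).
class ToyResult(object):
    """Stands in for the server-side (unbuffered) result set."""
    def __init__(self, nrows):
        self.remaining = nrows

    def fetch(self, n):
        n = min(n, self.remaining)
        self.remaining -= n
        return [(i,) for i in xrange(n)]

class ToySSCursor(object):
    def __init__(self, result):
        self._result = result

    def fetchone(self):
        rows = self._result.fetch(1)
        return rows[0] if rows else None

    def fetchall(self):
        # No batch size: everything left is materialized in one shot.
        return self._result.fetch(self._result.remaining)

    def close(self):
        # Mimics close() -> nextset() -> fetchall(): drain whatever is left.
        self.fetchall()

cur = ToySSCursor(ToyResult(1000000))   # imagine "reallybigtable" here
print cur.fetchone()                    # cheap: a single row
cur.close()                             # expensive: ~1M rows built at once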
Btw, why do we want to iterate through all the remaining rows, download them to the client, and then throw them away? Is this a MySQL limitation? Extremely frustrating...