We're trying to use this to insert a high volume of data into a SQLite DB. When I use this ADO.NET adapter, my application runs out of memory after inserting 195,000 rows. If I implement my own wrapper around the SQLite APIs, I have no problem inserting 1,000,000 rows. It is the same code, just switching out this ADO.NET adapter for my wrapper. Is there a limitation to how many consecutive rows you can insert before this adapter blows up?
My application grows to over 1.5 GB using this adapter.
Any insight is greatly appreciated.
It could be a memory leak. If you could post a snippet of the code containing the inserting loop, I could try to nail it down.
I'm posting the code per your request. Note that insertCount will be 1,000,000 and commitCount will be either 5,000 or 10,000. The idea is that we are trying to insert a huge amount of data, committing every 5K or 10K rows. This exact same code works great when I don't use the ADO.NET adapter and just use the SQLite APIs directly (well, with a little unsafe code).
Note also the commented-out code. I tried to use the ADO Transaction classes and I received an exception on this line of code:
myCommand.Transaction = myTrans;
Someone has already posted that this fails, I just don't have that version of code yet. So to work around it, I use BEGIN and COMMIT explicitly in my command string.
public void InsertUsingADO(int insertCount, int commitCount, int timeout)
{
    SQLiteConnection sqlConnection = new SQLiteConnection();
    sqlConnection.ConnectionString = "Data Source=" + dbName;

    // open connection
    sqlConnection.Open();

    // drop indices on transactions table
    SQLiteCommand dropIndexes = new SQLiteCommand();
    dropIndexes.Connection = sqlConnection;
    dropIndexes.CommandTimeout = timeout;
    dropIndexes.CommandText = @"DROP INDEX IX_Transaction;";
    dropIndexes.ExecuteNonQuery();

    while (insertCount > 0)
    {
        StringBuilder insert = new StringBuilder("BEGIN; ");
        for (int i = commitCount; i > 0; i--, insertCount--)
        {
            insert.Append("INSERT INTO Transactions (dp, tran_id, batch_id, fiscal_year, fiscal_period, entry_date, ");
            insert.Append("tran_date, currency_code, book_code, tran_desc, tran_amount, source_code, tran_type) VALUES (");
            insert.Append(insertCount.ToString()).Append(", ");                      // dp
            insert.Append(insertCount.ToString()).Append(", ");                      // tran_id
            insert.Append("1, ");                                                    // batch_id
            insert.Append("2004, ");                                                 // fiscal_year
            insert.Append("2, ");                                                    // fiscal_period
            insert.Append("'").Append(System.DateTime.Now.ToString()).Append("', "); // entry_date
            insert.Append("'").Append(System.DateTime.Now.ToString()).Append("', "); // tran_date
            insert.Append("'USD', ");                                                // currency_code
            insert.Append("'Budget_Original', ");                                    // book_code
            insert.Append("'AP Journal-GLTRX0004', ");                               // tran_desc
            insert.Append("123.45, ");                                               // tran_amount
            insert.Append("'GLBUDGET', ");                                           // source_code
            insert.Append("'Posted'); ");                                            // tran_type
        }
        insert.Append("COMMIT;");
        int stringSize = insert.Length;

        // push the big insert string to the DB
        try
        {
            // Start a local transaction.
            // SQLiteTransaction myTrans = sqlConnection.BeginTransaction() as SQLiteTransaction;
            SQLiteCommand myCommand = sqlConnection.CreateCommand();
            myCommand.CommandTimeout = timeout;
            // myCommand.Transaction = myTrans;
            myCommand.CommandText = insert.ToString();
            myCommand.ExecuteNonQuery();
        }
        catch (Exception ex)
        {
            StringBuilder errorMessage = new StringBuilder();
            errorMessage.Append("Failed to insert Transactions to the data source").Append(Environment.NewLine).Append(ex.Message);
            throw new ApplicationException(errorMessage.ToString());
        }
    }

    // add indices on transactions table
    SQLiteCommand createIndexes = new SQLiteCommand();
    createIndexes.Connection = sqlConnection;
    createIndexes.CommandTimeout = timeout;
    createIndexes.CommandText = @"CREATE INDEX IX_Transaction ON Transactions(dp, tran_date);";
    createIndexes.ExecuteNonQuery();

    sqlConnection.Close();
}
You're creating a lot of objects inside loops. I wouldn't do that. Remember, these objects will not free the memory they occupy until garbage collection occurs. I suggest moving the creation of objects out of the loop. I refactored your code a little bit; you can see the result in UnitTests.CommandTests.LargeInsertCommand. I ran this function and did not run out of memory. In fact, the memory consumption stayed relatively stable, around 20 MB.
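The refactoring being described amounts to hoisting the command and the string buffer out of the loop and reusing them on every pass. This is an illustrative sketch of that idea against the snippet above, not the actual UnitTests.CommandTests.LargeInsertCommand source; the column list is shortened for brevity:

```csharp
// Sketch: reuse one SQLiteCommand and one StringBuilder across all
// batches instead of allocating new objects in every loop iteration.
SQLiteCommand myCommand = sqlConnection.CreateCommand();
myCommand.CommandTimeout = timeout;
StringBuilder insert = new StringBuilder();

while (insertCount > 0)
{
    insert.Length = 0;               // reset the buffer rather than re-allocating it
    insert.Append("BEGIN; ");
    for (int i = commitCount; i > 0; i--, insertCount--)
    {
        insert.Append("INSERT INTO Transactions (dp, tran_id) VALUES (")
              .Append(insertCount).Append(", ").Append(insertCount).Append("); ");
    }
    insert.Append("COMMIT;");
    myCommand.CommandText = insert.ToString();
    myCommand.ExecuteNonQuery();     // same command object on every pass
}
```

The point is that only two objects are ever created, no matter how many batches run, so per-iteration garbage never piles up between collections.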
Thank you for looking at this. One thing that strikes me as interesting is that this is the exact same code we used when doing this same test with the SQL Server ADO.NET adapters and OLE DB adapters and did not have any issues. I am very interested to see your refactoring. However, I am very new to this area and I don't know where to find "UnitTests.CommandTests.LargeInsertCommand". Can you please help guide me to this source or just post it here.
Again, thank you very much for your time looking at this.
Well, that's why the adapter is still in beta stage.
Here is the link to refactored code:
I meant no offense by my comments. I realize this is only a Beta release and understand the constraints that come along with that. I am not picking on the software; my intention was to point out that the same logic does not fail with other adapters, while it does with this one. So maybe the leak was in the adapter.
Your refactoring of my code helps, and I appreciate it. However, I am unable to get it working to see if it solves my problem, because this line of code throws an exception:
I don't have time to get the latest version of the adapter and rebuild it. I will wait until you guys build and release a new version that fixes this. For now I will assume that this works.
Thank you very much for your help. You guys are building a great tool here that will be invaluable to those of us trying to use SQLite.
Actually, I didn't mean to insult you. My comment was more a sigh at not having enough time to chase down every bug. FYI, I've released a new version which allows assigning transactions to the command object.
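With that change, the transaction pattern that was commented out in the original snippet should work along these lines. This is a sketch using the names from that snippet, assuming the new build:

```csharp
// Sketch: use the ADO.NET transaction object instead of embedding
// BEGIN/COMMIT in the command text, now that Transaction assignment works.
SQLiteTransaction myTrans = sqlConnection.BeginTransaction() as SQLiteTransaction;
SQLiteCommand myCommand = sqlConnection.CreateCommand();
myCommand.CommandTimeout = timeout;
myCommand.Transaction = myTrans;     // previously threw an exception
myCommand.CommandText = insert.ToString();
myCommand.ExecuteNonQuery();
myTrans.Commit();                    // commits the whole batch at once
```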
I investigated this issue a little further. It appears that calling Dispose on the command object will remedy the situation.
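Applied to the loop body from the original snippet, that fix looks roughly like this (a sketch, not the exact production code):

```csharp
// Sketch: dispose the command each pass so the adapter releases its
// underlying SQLite resources immediately instead of waiting for the
// garbage collector to finalize the command object.
SQLiteCommand myCommand = sqlConnection.CreateCommand();
try
{
    myCommand.CommandTimeout = timeout;
    myCommand.CommandText = insert.ToString();
    myCommand.ExecuteNonQuery();
}
finally
{
    myCommand.Dispose();   // equivalently, wrap the command in a C# "using" block
}
```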
The new version works nicely to allow me to set the Transaction. Thanks for that.
Also, calling Dispose() on the command works to keep the memory usage down.