

Jose Cruz
  • Jose Cruz

    Jose Cruz - 2004-11-04

    I have to insert 15000 records into a table with 500 fields! Yes, it's true!

    - How will iBATIS.NET handle this?

    - Is it possible to do an insert using a list or DataTable, and have iBATIS call a stored procedure for each row?

    • roberto r.

      roberto r. - 2004-11-04


      iBATIS DataMapper is a data mapping layer that uses ADO.NET and your chosen provider to move data between your application objects and a datasource.

      If the 15000 records with 500 columns are existing database records within a SQL Server database table, there's always BCP, T-SQL, DTS to do data extraction, transformation, and loads.  Or substitute SQLLDR, PL-SQL, Oracle Warehouse Builder or MySQL LOAD, shell scripts, MydbPAL (I haven't used it though) or Informatica $$$.

      If these 15000 records are 15000 application objects that are loaded into memory already through the use of your application, then YES, you can iterate through your list/collection and call your iBATIS SqlMapper instance to execute an insert for each object. 

      This will not be performance-oriented though... iterating through 15000 objects with 500 properties each that are in a list/collection and moving data from your app domain objects through iBATIS.NET code down through ADO.NET etc...  It can be compared to using a DTS Transform Data Task that goes through one record at a time versus using a simple INSERT INTO some_table SELECT col_a, col_b, col_c FROM another_table WHERE col_a = ? with some_table's indexes and constraints disabled or turned off.
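      A minimal sketch of that iterate-and-insert approach, assuming an iBATIS.NET SqlMapper instance (`sqlMap`) and a mapped statement named "InsertWideRecord" (both placeholders, not from this thread). Wrapping the loop in one transaction at least avoids a commit per row:

      ```csharp
      // Sketch only: sqlMap is an IBatisNet.DataMapper SqlMapper instance;
      // "InsertWideRecord" is a hypothetical mapped statement name.
      sqlMap.BeginTransaction();
      try
      {
          foreach (object record in records)
          {
              sqlMap.Insert("InsertWideRecord", record);
          }
          sqlMap.CommitTransaction();
      }
      catch
      {
          sqlMap.RollBackTransaction();
          throw;
      }
      ```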


    • Jose Cruz

      Jose Cruz - 2004-11-04

      I have the records in a DataTable or array. My question is: do I always have to create a class to use as a mapper for each record? Or does iBATIS have some method to handle an array or DataTable? Mapping a DataRow to an object/class each time and then calling the iBATIS Mapper can be very expensive.

      • Ted Husted

        Ted Husted - 2004-11-04

        You might be able to just pass the DataRow to the Statement as the ParameterClass. A DataRow is an object, like any other :)

        I've been using DataTables internally lately, and I could give it a try later today.

        I don't remember if we support Arrays directly now, but it does sound like a good job for a ParameterMap :)
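        If passing a DataRow as the ParameterClass does work, the call would look something like the sketch below. This is speculative, as Ted says: whether iBATIS.NET can read properties off a DataRow directly is exactly the open question, and "InsertRow" is a placeholder statement name.

        ```csharp
        // Speculative sketch: one Insert call per DataRow, with the
        // DataRow itself as the parameter object. Untested.
        foreach (DataRow row in table.Rows)
        {
            sqlMap.Insert("InsertRow", row);
        }
        ```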


      • roberto r.

        roberto r. - 2004-11-04


        If you're using SQL Server 2000 (if not, please ignore!)...

        Since your data will be in a datatable or array, I'd take a look at these two CodeProject articles if batch insert performance is important and you don't want to do an execute insert on each datarow (you may have seen these already):

        MSDN pages:

        With the XML method, you'd have to build the XML string by iterating over each DataRow (it could be a pretty big string if each DataRow has 500 columns)...but the ADO.NET call to insert the data would only occur once, and the actual T-SQL INSERT statement would look like this (SELECT statement from the MSDN OPENXML page):

        INSERT INTO table_a
        SELECT *
        FROM   OPENXML (@idoc, '/ROOT/Customer/Order/OrderDetail',2)
                 WITH (OrderID       int         '../@OrderID',
                       CustomerID  varchar(10) '../@CustomerID',
                       OrderDate   datetime    '../@OrderDate',
                       ProdID      int         '@ProductID',
                       Qty         int         '@Quantity')

        I have no idea if this will be better in the end for you, but I can only imagine that the current 15000 records will grow in number in the future.

        Hope this helps!
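        A hedged sketch of the client side of that approach: DataSet.GetXml is standard ADO.NET and builds the XML string in one pass, while "usp_BulkInsertFromXml" is a hypothetical stored procedure that would call sp_xml_preparedocument and OPENXML on the server. Note that GetXml emits element-centric XML, so the procedure's OPENXML WITH clause would use element paths rather than the attribute (@) paths shown above.

        ```csharp
        // Sketch: serialize the whole DataTable to one XML string, then
        // pass it once to a hypothetical OPENXML-based stored procedure.
        DataSet ds = new DataSet("ROOT");
        ds.Tables.Add(table);            // table is the populated DataTable
        string xml = ds.GetXml();        // one string, built in one pass

        IDbCommand cmd = connection.CreateCommand();
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandText = "usp_BulkInsertFromXml";   // placeholder name
        IDbDataParameter p = cmd.CreateParameter();
        p.ParameterName = "@xml";
        p.Value = xml;
        cmd.Parameters.Add(p);
        cmd.ExecuteNonQuery();           // one round trip instead of 15000
        ```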


      • Ted Husted

        Ted Husted - 2004-11-05

        You can use a Hashtable instead of a user-defined object, and then use the Hashtable to populate a DataTable. The underlying value objects would not be recreated, you'd just be adding another reference to them to the DataTable. So, it is not as expensive as it sounds.
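        For example, a Hashtable can stand in for a user-defined parameter object, since the DataMapper resolves #name# placeholders against dictionary keys. The statement name and keys below are illustrative, not from this thread:

        ```csharp
        // Sketch: Hashtable as the parameter object instead of a rich class.
        // "InsertRecord" and the keys "Id"/"Name" are placeholders.
        Hashtable param = new Hashtable();
        param["Id"] = 42;
        param["Name"] = "example";
        sqlMap.Insert("InsertRecord", param);
        ```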

        I have been using rich objects, since they are easier to test. I've also been using DataTables to transfer data out to DataGrids and to populate forms.

        Here's a generic method I use to transfer a list of rich objects to rows in a DataTable:

        protected DataTable FormatForDataTableLoop (IList list, DataTable table)
        {
            int cols = table.Columns.Count;
            string[] id = new string[cols];
            for (int i = 0; i < cols; i++)
            {
                id[i] = table.Columns[i].ColumnName;
            }
            // column type and object type *must* match
            DataRow row;
            foreach (object source in list)
            {
                row = table.NewRow();
                for (int i = 0; i < cols; i++)
                {
                    row[i] = ObjectProbe.GetPropertyValue(source, id[i]);
                }
                table.Rows.Add(row);
            }
            return table;
        }
        This method might also work with Hashtables instead of rich objects. If it did, then you could use a generic helper function to acquire the list of Hashtables, copy the values to rows in a DataTable, and return the result.

        You'd have to provide the DataTable, though I'm thinking that's something that could be created from a factory, using something like Spring.

        I hope to work on this some more next week. If anyone has any ideas or interest in returning DataTables from iBATIS.NET queries, feel free to chime in.


