
Facing performance issues with PL/JSON library 1.0.4

2014-08-18
2014-11-30
  • Mratyunjaya Tiwari

    Hi,
    In my project, I have a requirement to pull down a large volume of data in JSON format from a web service. The web service has to be called iteratively, inside a loop, to pull the data down. I am using the UTL_HTTP package to call the web service.

    The web service returns a CLOB, which I convert to a JSON object with the library. Further processing, i.e. parsing the JSON response and extracting its values, is also done through the library. On each iteration I store the extracted values in a collection (type array).

    Finally, I insert the data into the database in batches of 10,000 records as a bulk insert using FORALL.
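
    For reference, here is a stripped-down sketch of what the loop looks like. The table name, column names, URL and JSON attribute names are placeholders rather than the real ones, and the PL/JSON calls shown are the 1.0.x object API (json, json_list, json_value) as I am using it:

        DECLARE
          -- Placeholder target table with (id NUMBER, name VARCHAR2) columns.
          TYPE t_rows IS TABLE OF my_target_table%ROWTYPE INDEX BY PLS_INTEGER;
          l_rows  t_rows;
          l_doc   CLOB;
          l_obj   json;       -- PL/JSON object type
          l_items json_list;  -- PL/JSON list type
          l_item  json;

          -- Fetch one page of the web service response as a CLOB.
          FUNCTION fetch_page(p_url VARCHAR2) RETURN CLOB IS
            l_req  utl_http.req;
            l_resp utl_http.resp;
            l_buf  VARCHAR2(32767);
            l_out  CLOB;
          BEGIN
            dbms_lob.createtemporary(l_out, TRUE);
            l_req  := utl_http.begin_request(p_url, 'GET', 'HTTP/1.1');
            l_resp := utl_http.get_response(l_req);
            BEGIN
              LOOP
                utl_http.read_text(l_resp, l_buf, 32767);
                dbms_lob.writeappend(l_out, LENGTH(l_buf), l_buf);
              END LOOP;
            EXCEPTION
              WHEN utl_http.end_of_body THEN
                utl_http.end_response(l_resp);
            END;
            RETURN l_out;
          END;
        BEGIN
          FOR page IN 1 .. 100 LOOP  -- iterative web service calls
            l_doc   := fetch_page('http://example.com/service?page=' || page);
            l_obj   := json(l_doc);                    -- slow step 1: CLOB -> JSON
            l_items := json_list(l_obj.get('items'));  -- slow step 2: extraction

            l_rows.DELETE;
            FOR i IN 1 .. l_items.count LOOP
              l_item         := json(l_items.get(i));
              l_rows(i).id   := l_item.get('id').get_number();
              l_rows(i).name := l_item.get('name').get_string();
            END LOOP;

            -- Bulk insert (in the real code this is flushed every 10,000 rows).
            FORALL i IN 1 .. l_rows.COUNT
              INSERT INTO my_target_table VALUES l_rows(i);
            COMMIT;
          END LOOP;
        END;
        /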

    The library seems to perform very slowly when it has to deal with large volumes of data, specifically when:
    1. Converting the CLOB variable into a JSON object
    2. Extracting values from the parsed JSON response

    Please let me know if someone has come across a similar situation, and whether the performance can be improved. I suspect there are memory leaks in this otherwise excellent library.

    Thanks,
    MT

     
  • Jonas Krogsboell

    With large amounts of data, the performance is poor. The reason is the data format, which uses the ANYDATA type to circumvent the restriction on circular type references. To handle large documents I would use Java and a proper library like Gson.
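
    Roughly, that means doing the heavy parsing in a Java stored procedure loaded into the database (loadjava the Gson jar together with your class) and exposing it to PL/SQL through a call specification. The class, method and procedure names below are only placeholders:

        -- Placeholder call spec: assumes a Java class JsonLoader with a static
        -- void load(java.sql.Clob) method has been loaded into the schema,
        -- together with the Gson jar, e.g. via loadjava.
        CREATE OR REPLACE PROCEDURE load_json_doc (p_doc IN CLOB)
        AS LANGUAGE JAVA
        NAME 'JsonLoader.load(java.sql.Clob)';
        /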

     
