From: John C. <jcu...@ho...> - 2001-02-04 23:58:22
Thomas,

I just fired up the system, so I have not retrieved an inventory of items yet. But if it puts them into one large HTML table, you could get a row count first and then set the size of the table up front. That will speed up the loading of the page, because when the browser knows the table's size in advance it fills it in as it goes instead of waiting for the complete download before putting the table on the screen. Or you could set up a cursor and fetch 20 rows at a time.

I don't have these solutions directly in front of me and am working from memory on some inventory issues from a previous job. There are PHP/PostgreSQL examples of how to do this, and there is some CPAN stuff for PostgreSQL that does it too. I'll look around and see what I can come up with; in the meantime I've put a couple of rough sketches at the bottom of this message, below your quoted mail.

John C.

Thomas Sawyer wrote:
> Hi again...
>
> The bigger thing is that I just loaded up the database with 25,000+ inventory
> records! PostgreSQL handled it nicely, loading them from an .sql file
> of insert statements in about 5 minutes.
>
> But when I went to my Windows client machine and did an inventory search
> on all the items, it took nearly 30 minutes to load the page! It worked,
> though, so that's good! The slowdown apparently comes from building such
> a large page -- not so much from the transfer over the network. CGI Perl
> isn't all that fast. Is there any way to speed this up?
>
> One thing that happened during the course of this load is that Windows
> 2000 reported that it needed to increase virtual memory to complete the
> operation. I bet it did! That's one big .html file!
>
> I have to get this working with a large inventory like this. So I'm looking
> at adding some selection criteria limitations to the ic.cgi script. I'm
> thinking of adding limiting criteria where one has to select the first
> and second letter of the search (i.e. Aa, Ab, Ac ... Zx, Zy, Zz). How does
> this sound to you all? Any other ideas?
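
Here is a rough sketch of the cursor idea in Perl with DBI and DBD::Pg. The database name, login, and the items/item_no/description names are just placeholders I made up, not the real ic.cgi schema, so adjust to taste:

    #!/usr/bin/perl -w
    use strict;
    use DBI;

    # Placeholder connection details and table/column names -- change these
    # to match the real schema.
    my $dbh = DBI->connect('dbi:Pg:dbname=inventory', 'user', 'password',
                           { RaiseError => 1, AutoCommit => 0 });

    # A cursor only lives inside a transaction, hence AutoCommit => 0 above.
    $dbh->do('DECLARE inv_cur CURSOR FOR
              SELECT item_no, description FROM items ORDER BY description');

    my $fetch = $dbh->prepare('FETCH 20 FROM inv_cur');
    while (1) {
        $fetch->execute;
        my $got = 0;
        while (my ($item_no, $desc) = $fetch->fetchrow_array) {
            $got++;
            print "<tr><td>$item_no</td><td>$desc</td></tr>\n";
        }
        last unless $got;    # no rows left in the cursor
    }

    $dbh->do('CLOSE inv_cur');
    $dbh->commit;
    $dbh->disconnect;

The nice part is that the page gets printed a batch at a time instead of being built all at once in memory.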
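
And for your two-letter idea, something along these lines should do it. Again, the 'prefix' parameter name and the table/column names are invented for the example, not what ic.cgi actually uses:

    #!/usr/bin/perl -w
    use strict;
    use DBI;
    use CGI;

    my $q      = CGI->new;
    my $prefix = $q->param('prefix') || '';

    # Accept exactly two letters so the pattern can't be abused.
    unless ($prefix =~ /^[A-Za-z]{2}$/) {
        print $q->header, "Please pick a two-letter range.\n";
        exit;
    }

    my $dbh = DBI->connect('dbi:Pg:dbname=inventory', 'user', 'password',
                           { RaiseError => 1 });

    # lower() + LIKE gives a case-insensitive prefix match, and the
    # placeholder keeps the quoting safe.
    my $sth = $dbh->prepare('SELECT item_no, description FROM items
                             WHERE lower(description) LIKE ?
                             ORDER BY description');
    $sth->execute(lc($prefix) . '%');

    print $q->header, "<table>\n";
    while (my ($item_no, $desc) = $sth->fetchrow_array) {
        print "<tr><td>$item_no</td><td>$desc</td></tr>\n";
    }
    print "</table>\n";
    $dbh->disconnect;

Even split two letters at a time you may still get a few thousand rows back for the common letters, so it is probably worth combining this with the cursor (or with a LIMIT/OFFSET style "next 20" link).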