From: Jon P. <jo...@cr...> - 2007-03-21 04:32:52
On Tue, 2007-03-20 at 21:28 -0700, Jon Phillips wrote:
> On Tue, 2007-03-20 at 19:04 -0700, Mike Linksvayer wrote:
> > On Tue, 2007-03-20 at 18:59 -0700, Jon Phillips wrote:
> > > Yes, it seems to be working if the memory_limit php setting is high
> > > enough to deal with the load. I added an ini_set for this CLI option to
> > > the data dump code. It appears to be working fine on ccmixter.org. I
> > > added this for openclipart.org's data dump to work.
> > >
> > > That is ok temporarily, but another approach should be found for
> > > scalability, soon...
> >
> > Is the entire dump being constructed in memory before being written? Or
> > is the entire db resultset read into memory before the dump is
> > constructed?
>
> The first one, unfortunately... Yes, it needs to be paged through and
> written piece by piece... I haven't looked at the new query code too
> much... VS, what is the best way to accomplish this now with paging and
> the new query code?
>
> Jon
>
> > There's no reason a dump should be memory limited. If we're using a
> > library that does one or both of the things above, it is bogus. :-)
>
> Yah, agree... one of those fastest-line-to-getting-done tasks, if I
> remember ;) I'll add it to my list ;)
>
> Jon

-- 
Jon Phillips
jo...@cr...
cell: 510.499.0894
Community/Business Developer
Creative Commons
www.creativecommons.org
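[Editor's note: the streaming approach Jon describes above, paging through the resultset and writing each page out before fetching the next, could be sketched roughly as below. This is a language-agnostic illustration in Python with a hypothetical `uploads` table, not the actual ccHost/ccMixter query code.]

```python
import io
import sqlite3

PAGE_SIZE = 2  # tiny for the demo; a real dump would use e.g. 1000


def dump_table(conn, table, out):
    """Write one table's rows to `out` a page at a time, so peak memory
    use is bounded by PAGE_SIZE rows regardless of table size."""
    offset = 0
    while True:
        rows = conn.execute(
            f"SELECT * FROM {table} LIMIT ? OFFSET ?",  # noqa: table name is trusted here
            (PAGE_SIZE, offset),
        ).fetchall()
        if not rows:
            break  # past the last page
        for row in rows:
            out.write("\t".join(map(str, row)) + "\n")
        offset += PAGE_SIZE


# Demo against an in-memory database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE uploads (id INTEGER, name TEXT)")
conn.executemany(
    "INSERT INTO uploads VALUES (?, ?)",
    [(i, f"file{i}") for i in range(5)],
)

buf = io.StringIO()
dump_table(conn, "uploads", buf)
print(buf.getvalue(), end="")  # five tab-separated rows, written page by page
```

One caveat worth noting: `LIMIT ... OFFSET` re-scans skipped rows on each page, which degrades on very large tables; keyset pagination (`WHERE id > last_seen_id ORDER BY id LIMIT n`) avoids that and would be the more scalable choice for a site-wide data dump.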