On Fri, 28 Sep 2001, Steve Alberty wrote:
> I have some trouble with the dump feature.
> Has anybody tested it with tables greater than 15 MB and
> a slow server (350 MHz PII)? It is impossible to get a dump
> from a big table with the current version, but the 'old'
> phpMyAdmin 2.2.0 handled these tables easily.
Yes, I have tested it. Have a look at Bug #448223, "Dump Hangs".
The limitations are that PHP times out, that PHP runs out of memory, or
that PHP takes so long that the browser times out.
We will not be able to escape the browser timing out, though.
> I think we should turn back to the old method.
> What do you think about it?
As Loic mentioned in his reply to this message, we need the new method
for the compressed output and the output buffering. If we can figure out a
way to make those send packets of data while we are building the dump,
then we can eliminate this problem, but I do not know if that is possible. I
think it could be done with gzip (due to the nature of gzip) and output
buffering (ob_flush?), but I am not certain about bzip2 compression (which
never seemed to work with my PHP anyway; something with the libraries).
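To sketch why gzip should allow this: gzip is a stream format, so a dump can be compressed chunk by chunk and each compressed packet pushed to the browser immediately, rather than building the whole dump in memory first. A minimal illustration in Python (hypothetical, not phpMyAdmin code; the `sent` list stands in for `print` plus `ob_flush()`/`flush()` in PHP):

```python
import zlib

# Incremental gzip compression: each chunk of the dump is compressed
# and "sent" as soon as it is produced, keeping memory use flat and
# resetting the browser's idle timeout with every packet.
compressor = zlib.compressobj(wbits=31)  # wbits=31 selects the gzip container

dump_chunks = [b"INSERT INTO tbl VALUES (%d);\n" % i for i in range(3)]

sent = []  # stand-in for flushing output to the browser
for chunk in dump_chunks:
    sent.append(compressor.compress(chunk))
sent.append(compressor.flush())  # emit the trailing gzip footer

gzipped = b"".join(sent)
# decompressing the streamed packets reproduces the original dump
assert zlib.decompress(gzipped, wbits=31) == b"".join(dump_chunks)
```

The same pattern should work for any stream compressor; whether the bzip2 bindings in PHP expose an incremental interface is the open question above.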
Robin Hugh Johnson
QTOD: "I used to be an idealist, but I got mugged by reality."
E-Mail : robbat2@...
ICQ# : 30269588 or 41961639
Home Page : http://www.orbis-terrarum.net
Time Zone : Pacific Daylight (GMT - 8)
-----BEGIN GEEK CODE-----
GCS/M/IT d-(+) s+:- a--- C++++
U++++ L++++ P--(+) W++ K++ PS+
N++ w--- O E---- M-(+) V-- Y++
PE++ PGP++ t-- 5 X+ R tv- b+++
D++ G++ e(*) h! r-- !y+
------END GEEK CODE------
5447C73A 30FB144C 89521B69 2D6A615E 7E20DFA1