This item was created to collect the requests made for the export feature. So far I have found RFEs #517562, #632237 and #5390201. I haven't read through all the long texts there, but IMHO the wanted things are just:
- save exported files on server
- split export into parts (also needed for the above)
Logged In: YES
user_id=192186
- do not store dump in memory when not needed (#726834)
Logged In: YES
user_id=587972
Feature Request #539020 (misquoted above as 5390201)
contains many similar requests to this one and offers some
preliminary code for how this might be accomplished.
I very much agree with the principle of this request, though it
might be redundant.
Logged In: YES
user_id=192186
I know about it; I just wanted to merge all the export things I want to work on, and because the BTS here doesn't allow easy merging, I just created a new one and closed all the others...
Michal
Logged In: YES
user_id=192186
Support for saving dumps on the server has just been implemented.
Logged In: YES
user_id=587972
Sounds great - implemented in 2.5.0? I don't see it in the
changelog.
Additionally, from issue #632237, there were some details I
think still ought to be considered. Two sample paragraphs:
-----
You should add a pull-down menu where you can select a "step size" from 500 up to 5000 queries. For example, I select a step of 3000 and press export. After a one-file export, my example DB gives a *.sql file with 30 queries that create my tables and 60,000 INSERT queries, so nearly 60,000 queries in all. 60,000 / 3000 = 20 "steps". So what does "steps" mean? The export process has to count the queries it writes. (A function should count all tables and rows before the new export starts, so the export knows that a split export with the given step size is what the user wants.) OK, one function tells another that a split export is required, so be it. The export now counts all queries, takes a break after the 3000th query, closes the file and creates a new one, which covers queries 3001 to 6000. And so on, until the last data is written to the *.sql files.
For example, when PMA has a *.sql import in 20 parts, PMA should open a JavaScript sub-window that warns the user that this is an import with several steps and asks them to be patient until the process is finished (you could show a counter like "part 1 / 20", a simulated progress bar with 20 steps, or something similar). The popup should close itself at the end of the whole import, so the user can take a cup of coffee or do something else until the import is finished, and then call up the normal window containing the PMA result page.
-----
I think those are both good ideas (a way to have an export go in steps, and a JavaScript window to help with the process), and I hope they will still be considered.
-Steve
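The "step size" splitting quoted above can be sketched as a small standalone script. This is only an illustration, not phpMyAdmin's implementation; it assumes each SQL statement ends with a ";" at the end of a line, which is a simplification (a real splitter would also have to handle semicolons inside string literals and DELIMITER changes):

```python
# Sketch: copy a .sql dump into numbered part files, starting a new
# part after every `step_size` statements. Assumes one statement ends
# with ";" at end of line -- a simplification for illustration.
import os

def split_dump(src_path, step_size=3000, out_dir="parts"):
    """Split src_path into part_001.sql, part_002.sql, ...
    Returns the number of part files written."""
    os.makedirs(out_dir, exist_ok=True)
    part_no, count, out = 0, 0, None
    with open(src_path, encoding="utf-8") as src:
        for line in src:
            if out is None:                 # open the next part file
                part_no += 1
                name = os.path.join(out_dir, f"part_{part_no:03d}.sql")
                out = open(name, "w", encoding="utf-8")
            out.write(line)
            if line.rstrip().endswith(";"):
                count += 1
                if count == step_size:      # "take a break" after query N
                    out.close()
                    out, count = None, 0
    if out is not None:
        out.close()
    return part_no
```

With a 60,000-query dump and a step size of 3000, this produces the 20 parts described above.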
Logged In: YES
user_id=192186
Not in 2.5.0, currently only in CVS...
The export in steps is planned; I will work on it as I find time for it...
Logged In: YES
user_id=192186
The only thing not implemented for now is partial exports.
Logged In: NO
And split imports?
Logged In: YES
user_id=192186
If needed, you have to split import files on your local host before transferring them to the server; there is no need to support this specially.
Logged In: NO
Hello,
I'm not sure about this.
I have dumps of over 150 MB, and I failed to import them. Even dumps of about 2 MB can't be imported.
This tool: http://www.ipbsupport.com/index.php?p=downloads&fileid=9 processes the import in batches. It took about 3 hours to do.
Splitting locally is an idea, though it would mean making at least 75 splits, and triple that, as I've merged 3 of them :) And still growing.
The problem is also that my text editors/system simply refuse to open such large text files.
It would be nice if phpMyAdmin could do the same thing as the above tool.
TosaInu (aka nobody)
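For what it's worth, batch-wise processing of a huge dump does not require opening it in an editor or loading it into memory. A minimal sketch (not the tool mentioned above; it again assumes each statement ends with ";" at end of line) is a generator that yields one batch of statements at a time, so a client could execute each batch and discard it before reading the next:

```python
# Sketch: stream a large .sql dump as batches of statements, holding
# at most one batch in memory. Assumes statements end with ";" at the
# end of a line -- a simplification for illustration.

def iter_batches(path, batch_size=500):
    """Yield lists of at most `batch_size` SQL statements from `path`."""
    batch, stmt_lines = [], []
    with open(path, encoding="utf-8") as src:
        for line in src:
            stmt_lines.append(line)
            if line.rstrip().endswith(";"):     # statement complete
                batch.append("".join(stmt_lines))
                stmt_lines = []
                if len(batch) == batch_size:
                    yield batch
                    batch = []
    if batch:
        yield batch                             # final partial batch

# Usage idea (hypothetical): for each batch, send its statements to the
# server, then let the batch be garbage-collected before the next one.
```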
Logged In: YES
user_id=192186
I'm closing this in favour of RFE #1381854; this is more a relic from history.