#3228 (ok 3.4.0-beta4) bug in csv import functionality

Version: 3.3.9
Status: fixed
Priority: 1
Updated: 2013-06-11
Created: 2011-03-01
Creator: Anonymous
Private: No

Hi,
I have come across what may potentially be a bug in the CSV import functionality of phpMyAdmin version 3.3.9. I have a tab-delimited file. No major problems there; I just fill in the proper values in the import section and all should be well. The problem is that even with the proper field-termination values entered, the import script fails because it is not picking up any of the field names, even though they do indeed exist. I have tested the file with another import tool and it imports fine, so I am certain that this error is specific to the phpMyAdmin installation. Any thoughts on this, or has anyone else encountered it?
Here is the error message.
SQL query:
CREATE TABLE IF NOT EXISTS `boneill_data`.`TABLE 1` ( ) ENGINE = MYISAM DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci;
MySQL said:
#1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ') ENGINE=MyISAM DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci' at line 1

You can see the problem: there are no field names between the brackets, as phpMyAdmin is not detecting them. Any thoughts are appreciated.

I have further confirmed this problem by saving the text file as an open document spreadsheet. I then tried to import the open document spreadsheet and it came in fine. This further supports the possibility of it being a bug in the CSV import functionality.

Regards, Barry.

Discussion

  • Madhura Jayaratne

    Hi Barry,
    Can you attach the .csv file, or a portion of it, if it does not contain any sensitive data? It will be of great help to anybody who's looking into this.

     
  • Madhura Jayaratne

    • status: open --> pending
     
  • Anonymous - 2011-03-04

    Tab-delimited text file which phpMyAdmin is having problems with.

     
  • Anonymous - 2011-03-04

    Hi madhuracj,
    Thanks for your reply. I have uploaded the file now and have removed the sensitive data. I retested the file to confirm it still causes the import failure, and indeed it does. Once again, to clarify: if you save this .txt file as .csv it still fails (.txt format should be fine anyway), but if you save it as an open document spreadsheet and then try to import it, it works fine. This would seem to point to an error in the CSV import section. Thanks for the help; I appreciate it.
    Regards,
    Barry.

     
  • Madhura Jayaratne

    • assigned_to: nobody --> madhuracj
    • status: pending --> open
     
  • Madhura Jayaratne

    • status: open --> pending
     
  • Madhura Jayaratne

    CSV import works fine for me. Did you indicate the correct delimiter?

    In the CSV import window, after you have chosen the file,
    1) Change 'Fields terminated by' to '\t' to indicate that the delimiter is the tab character.
    2) tick 'Column names in first row'
    and press 'Go'.
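These import settings have direct analogues in most CSV libraries. As a rough sketch only (the file contents here are invented for illustration), Python's csv module maps the same three options like this:

```python
import csv
import io

# Made-up tab-delimited data mirroring the settings above:
# fields terminated by \t, enclosed by ", escaped by \.
data = 'id\tname\n1\t"O\'Neill, Barry"\n'

reader = csv.reader(
    io.StringIO(data),
    delimiter="\t",    # 'Fields terminated by' = \t
    quotechar='"',     # 'Fields enclosed by'  = "
    escapechar="\\",   # 'Fields escaped by'   = \
)
header = next(reader)  # 'Column names in first row'
print(header)          # ['id', 'name']
for row in reader:
    print(row)         # ['1', "O'Neill, Barry"]
```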

     
  • Anonymous - 2011-03-04

    Hmmm, that is strange. That's the way I was doing it from the start: using \t as the delimiter, with column names in the first row. I work as a PHP developer and DBA professionally, so I deal with both databases and PHP and am well used to importing often large files on a regular basis, normally without issue.

    I just figured out what the problem may have been. The files I deal with 99% of the time are tab-delimited text files. They don't have the text fields enclosed in quotation marks, nor do they have escape characters in them, as I get them from non-technical providers and have to import and clean them before building applications around them. As such, I had specified \t for the field-termination value but cleared the "enclosed by" and "escaped by" fields, as they did not apply to my file. This was indeed what was causing the issue. I just tested the theory here using a file with quotation marks and escape characters, and a file without them, which is just a plain tab-delimited file. It worked fine with the "enclosed by" and "escaped by" fields left set to their default values, but it failed when I cleared them, even on the file that does not use them.

    It's working now, so that's great. Perhaps it might be an idea to put a message in the CSV import area telling the user to leave the "escaped by" and "enclosed by" fields set even if their file does not use the enclosing quotation marks or escape characters. Thanks for the help; at least we got to the bottom of it.
    Regards,
    Barry.
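Barry's finding, that parsing works with the enclosure left at its default but breaks when the field is cleared, has a parallel in other CSV implementations. A minimal sketch in Python (data invented for illustration):

```python
import csv
import io

# Plain tab-delimited data with no quoting at all, like Barry's files.
data = "id\tname\tcity\n1\tAlice\tDublin\n"

# Leaving quotechar at its default ('"') is harmless when the file
# never uses quotes: fields are split purely on the tab delimiter.
rows = list(csv.reader(io.StringIO(data), delimiter="\t"))
print(rows)  # [['id', 'name', 'city'], ['1', 'Alice', 'Dublin']]

# An empty enclosure value, by contrast, is not a valid setting:
# Python's csv module rejects it at construction time, much as
# phpMyAdmin's import broke when the field was cleared.
try:
    csv.reader(io.StringIO(data), delimiter="\t", quotechar="")
    rejected = False
except TypeError:
    rejected = True
print("empty quotechar rejected:", rejected)
```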

     
  • Madhura Jayaratne

    • priority: 5 --> 1
    • summary: bug in csv import functionality --> (ok 3.4.0-beta4) bug in csv import functionality
    • status: pending --> open
     
  • Madhura Jayaratne

    This bug was fixed in repository and will be part of a future release; thanks for reporting.

     
  • Madhura Jayaratne

    • status: open --> open-fixed
     
  • Marc Delisle - 2011-03-12

    • status: open-fixed --> closed-fixed
     
  • Michal Čihař - 2013-06-11

    • status: closed-fixed --> fixed