Datadump Fails with Exit Code 1

Solution 1

I didn't get all the way down to the root cause, but I found the source of the problem. I had one very large table of attachments being written with INSERT ... VALUES statements. I removed this table and then tried to run the rest as a plain SQL script, but got a new error (which escapes me right now). Googling that error told me the script was too long to run without increasing the max_allowed_packet variable. Turning this up to a really large value allowed the script to run.
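
For reference, a minimal sketch of how to check and raise that limit, assuming a user with enough privileges to set global server variables; the 1 GB figure is only an example:

    -- Check the current limit (the default is fairly small on older servers)
    SHOW VARIABLES LIKE 'max_allowed_packet';

    -- Raise it for the running server (new connections pick up the new value)
    SET GLOBAL max_allowed_packet = 1073741824;  -- 1 GB, example value

To make the change permanent, set max_allowed_packet under the [mysqld] section of my.ini / my.cnf and restart the server; note that the mysql command-line client has its own max_allowed_packet setting as well.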

Solution 2

Do you have any foreign key constraints in your database? If you do, you have to make sure that you import the tables in an order that allows the foreign keys to be enforced. If you try to create a table whose foreign key references a table that hasn't been created yet, you will get this kind of error. If this is the problem, you will either have to manipulate your dump file to get it into the "correct" order, or redo the dump one table at a time in an order that will be acceptable. I have a script that does this for my DB, but I have never tried to do it through phpMyAdmin.
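
As an illustration of why the order matters, the child table below will fail to create if it is restored before its parent table exists; wrapping the import in a FOREIGN_KEY_CHECKS toggle (which mysqldump output typically does for you) is the usual way around it. The table names here are made up for the example:

    -- Fails if `parent` has not been created yet
    CREATE TABLE child (
      id INT PRIMARY KEY,
      parent_id INT,
      FOREIGN KEY (parent_id) REFERENCES parent(id)
    );

    -- Common workaround: suspend the checks for the duration of the import
    SET FOREIGN_KEY_CHECKS = 0;
    -- ... run the dump file here ...
    SET FOREIGN_KEY_CHECKS = 1;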

Comments

  • Ruser
    Ruser over 1 year

    I am in MySQL Workbench trying to import a data dump in a .sql file (generated by phpMyAdmin) into MySQL (Server Administration > Data Dump > Import from Disk), and it keeps failing with the following output:

    10:47:14 Restoring C:\myfolder\localhost.sql ( )

    Running: mysql.exe --host=127.0.0.1 --user=root --password=xxxxxxx --port=3306 --comments< C:\myfolder\localhost.sql

    Operation failed with exitcode 1

    10:47:15 Import of C:\myfolder\localhost.sql has finished

    When I go to view the schema, it looks like it imported 6 tables (of 50ish) and almost no data at all.

    Errata:

    1. There don't appear to be any constraints or relationships. All the data is being added via a series of DROP TABLE IF EXISTS / CREATE TABLE / INSERT INTO statements with hard-coded values.
    2. The file contains a substantial number of large data chunks, and the file itself is over 400 MB.
    3. When I manually checked which statements in the file actually took effect, I discovered it is getting hung up on the INSERT INTO statements with the VERY large hard-coded chunks. Not sure if that's significant or not?
    4. I also noticed that some of the chunks have a length greater than 32768; is there an upper bound on chunk length? (See the quick check after this list.)
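
    For reference, a rough way to compare the largest stored chunk against the server's single-statement ceiling (max_allowed_packet); the table and column names below are hypothetical stand-ins for the attachments data:

        -- Hypothetical names; substitute the real attachments table and BLOB column
        SELECT MAX(LENGTH(file_data)) AS largest_chunk_bytes
        FROM attachments;

        -- The single-packet ceiling each INSERT statement must fit under
        SHOW VARIABLES LIKE 'max_allowed_packet';
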
  • Ruser
    Ruser over 14 years
    Hi Catherine, Thanks for taking an interest:) I added more info to my post in hopes that would be helpful.
  • Catherine MacInnes
    Catherine MacInnes over 14 years
    No problem. I'm glad you figured it out.
  • Steven Evers
    Steven Evers over 13 years
    I have come across this before, and in my case, increasing the packet size in the network settings for the instance to a larger value solved the problem. I forget how I came to the solution though... but I recall there being sound reasoning :)