How to skip row when importing bad MySQL dump
Solution 1
If you can create the dump again, add --insert-ignore to the mysqldump command line.
Alternatively, try the mysqlimport command with --force, which will continue even if it encounters MySQL errors.
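As a sketch of the two approaches above (the database name and file names are placeholders, and both commands assume a reachable MySQL server):

```shell
# Re-create the dump so every row becomes INSERT IGNORE
mysqldump --insert-ignore -u root -p mydb > mydb.sql

# Or load tab-delimited data with mysqlimport, continuing past errors
mysqlimport --force -u root -p mydb /tmp/mytable.txt
```

Note that mysqlimport loads per-table data files, so it is not a drop-in replacement for replaying a full SQL dump.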
Solution 2
mysql -f -p < 2010-12-01.sql
The -f (force) option is the operative part here; this worked for me.
Solution 3
Following the advice from jmlsteele's answer and comment, here's how to turn the INSERTs into INSERT IGNORE on the fly.
If you're importing from an .sql file:
sed -e "s/^INSERT INTO/INSERT IGNORE INTO/" < 2010-12-01.sql | mysql -p
If you're importing from a .gz file, just pipe the output of gunzip into sed instead of using the file input:
gunzip < 2010-12-01.sql.gz | sed -e "s/^INSERT INTO/INSERT IGNORE INTO/" | mysql -p
Solution 4
Great tip. I did it a little differently, but with the same result.
perl -pi -e 's/INSERT INTO/INSERT IGNORE INTO/g' filename.sql
Solution 5
The other options are certainly viable, but another solution is to simply edit the .sql file produced by mysqldump.
Change:
INSERT INTO table_name ...
to:
INSERT IGNORE INTO table_name ...
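If you'd rather not open the file in an editor, the same edit can be done in place with sed. This is a minimal sketch using a sample file (the file name is a placeholder for your real dump; -i.bak is GNU sed syntax and keeps a backup copy):

```shell
# Create a tiny sample dump to demonstrate on; use your real dump file instead.
printf 'INSERT INTO wrd VALUES (1, "a");\n' > sample.sql

# Rewrite INSERT INTO -> INSERT IGNORE INTO in place, keeping sample.sql.bak
sed -i.bak 's/^INSERT INTO/INSERT IGNORE INTO/' sample.sql

cat sample.sql
```

Anchoring the pattern to the start of the line (^) avoids touching "INSERT INTO" text that happens to appear inside quoted data mid-line.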
Comments
-
Almad almost 2 years
Given a bad mysqldump that causes an error on import:
namtar backups # mysql -p < 2010-12-01.sql
Enter password:
ERROR 1062 (23000) at line 8020: Duplicate entry 'l�he?' for key 'wrd_txt'
Is there an easy way to tell the import to just skip the given row and continue?
(Yes, I know I can manually edit the file or parse the output, but it's not very convenient)
-
Almad over 12 years I cannot create the export again... and how do I use mysqlimport against the output of mysqldump --all-databases?
-
jmlsteele over 12 years I just tried using --force with mysql and it causes the insertion to stop when the error occurs. mysqlimport doesn't work with an --all-databases dump, so that's out of the question as well. The easiest way would be to edit the file: not to remove the offending line(s), which could be tedious, but to turn the "INSERT" commands into "INSERT IGNORE" commands. A simple find/replace should suffice. I realise it's not an ideal solution, but it's better than having to dig through the file replacing offending inserts manually as you find out about them.
-
zhihong about 11 years In my case, I restored a .sql file into MySQL Cluster 5.5.29-7.2.10, and -f also works. It lists all the ERROR lines, and it can also ignore errors like: ERROR 1528 (HY000) at line 22: Failed to create LOGFILE GROUP, and reuse the LOGFILE already created.
-
Almad almost 11 years Wouldn't this also replace "INSERT INTO" text inside text fields?
-
Ben almost 11 years @Almad - It would only change the first INSERT INTO; any additional ones on the same line would not be changed. As far as I'm aware, any time you'd have "INSERT INTO" in a text field it would be following an INSERT INTO statement. But I changed the answer to make sure it's only replaced when it's at the beginning of a line.
-
reallynice over 8 years A dump can have multiple INSERT statements (even thousands or more), one per row: in that case the solution wouldn't affect the statements after the first. Maybe with a more complex regex it's possible to achieve the result, substituting only the INSERT INTO occurrences that aren't in text fields.
-
PromInc about 7 years @Almad the solution you linked lets you make the same change I proposed from the command line at import time, without modifying the .sql file. My suggestion here is to actually open the file, edit its contents, and save it before importing. While similar, I don't feel these are duplicate answers.