how to skip duplicate records when importing in phpmyadmin
Solution 1
In phpMyAdmin, on the Settings tab, you can try checking the following values:
- Settings -> SQL Queries -> Ignore multiple statement errors
If you are using CSV format:
- Settings -> Import -> CSV -> Do not abort on INSERT error
If you are using SQL format:
- Settings -> Export -> SQL -> Use ignore inserts
Solution 2
There are a couple of ways to do what you want:
The brutal way:
TRUNCATE TABLE yourTbl; -- empties out the table
Then import, but you might lose data, so perhaps create a backup table first. All things considered, just don't do this; check the alternatives listed below:
Write your own INSERT query, with an IGNORE clause:
INSERT IGNORE INTO yourTbl -- as shown in the linked duplicate
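The effect of that IGNORE clause can be sketched in a few lines. This is a hedged illustration using Python's built-in sqlite3 module as a stand-in for MySQL (SQLite spells the clause INSERT OR IGNORE rather than INSERT IGNORE); the table and column names are made up for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE your_tbl (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO your_tbl VALUES (1, 'existing row')")

# A plain INSERT of a duplicate key would raise an error and abort the
# import; OR IGNORE silently skips the conflicting row instead.
conn.execute("INSERT OR IGNORE INTO your_tbl VALUES (1, 'duplicate')")
conn.execute("INSERT OR IGNORE INTO your_tbl VALUES (2, 'new row')")

rows = conn.execute("SELECT id, name FROM your_tbl ORDER BY id").fetchall()
print(rows)  # the duplicate for id=1 was skipped, id=2 went in
```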
But since you are importing a file, the query will most likely be a LOAD DATA [LOCAL] INFILE. As you can see in the manual, you can easily add an IGNORE to that query, too:
LOAD DATA LOCAL INFILE '/path/to/files/filename1.csv' IGNORE -- IGNORE goes here
INTO TABLE your_db.your_tbl
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(`field1`,`field2`);
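Since the question also asks for a list of the duplicates at the end, here is a rough sketch of that extra step: emulate the skip-on-duplicate import in application code and collect every rejected row. Again sqlite3 and the csv module stand in for MySQL, and the file contents, table, and column names are invented for illustration:

```python
import csv, io, sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE your_tbl (field1 INTEGER PRIMARY KEY, field2 TEXT)")
conn.execute("INSERT INTO your_tbl VALUES (1, 'already on the host')")

# Stand-in for the CSV file: ';'-separated, as in the query above.
csv_file = io.StringIO("1;dupe\n2;fresh\n3;also fresh\n")

skipped = []
for field1, field2 in csv.reader(csv_file, delimiter=";"):
    try:
        conn.execute("INSERT INTO your_tbl VALUES (?, ?)", (int(field1), field2))
    except sqlite3.IntegrityError:
        skipped.append((int(field1), field2))  # duplicate primary key: skip it

print(skipped)  # the rows rejected because their key already existed
```

With real MySQL you don't need application code for the listing: LOAD DATA ... IGNORE reports each skipped duplicate as a warning, so running SHOW WARNINGS right after the import shows them.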
That's it. Don't worry if you're not too comfortable writing your own queries like this; there are other ways of doing what you want to do:
The CLI way:
mysqlimport -i dbname fileToImport
# Or
mysqlimport --ignore dbname fileToImport
Also CLI: create a file containing the LOAD DATA query above, then:
$: mysql -u root -p
*********** #enter password
mysql> source /path/to/queryFile.sql
This requires you to have access to the command line to run these commands. Here's the manual page of MySQL.
Using phpMyAdmin, when importing, you'll find a checkbox saying "Ignore duplicates", check that and import. Here's a page with screenshots
You could also choose to check "Ignore errors", but that's another brute-force approach, and I wouldn't recommend that.
Jo E.
Updated on August 29, 2020

Comments
-
Jo E. over 3 years: I have a db on my local machine and I want to import the data to the db on my hosting. Both dbs are identical: the same table names, column names, etc. When I export the table from my local db through phpMyAdmin and import it through phpMyAdmin on my hosting, an error pops up telling me that there are duplicate entries for the primary key, and it stops the whole operation. How can I import the data through phpMyAdmin, skip the duplicate entries, and display a list of the duplicates at the end of the process? A solution that I can do is call all the values of the primary key in the db at my hosting and filter the duplicates before import. BUT I am wondering if there is a quick solution for this with phpMyAdmin?
-
Karl over 10 years: Duplicate of: stackoverflow.com/questions/9919105/…
-
Jo E. over 10 years: @Karl I saw that post. But I don't see an area in the phpMyAdmin import tab where I can type in SQL code?
-
Elias Van Ootegem over 10 years: So all you want to do is IGNORE duplicates on INSERT (combine the two, and you get INSERT IGNORE)
-
Karl over 10 years: @joespina you'd need to edit the file you're trying to import.
-
Karl over 10 years: nl-x, please reference the "INSERT IGNORE" method in your answer too, just in case someone truly is searching for it. Just to stop anyone else submitting another answer.
-
nl-x over 10 years: @Karl: I don't understand what you mean. The phpMyAdmin Export Ignore Insert is the third bullet point in my answer.
-
Elias Van Ootegem over 10 years: @nl-x: I think Karl meant that, for your answer to be more complete (and relevant for future reference), you might want to consider adding the actual queries that one can use to do this. Check my answer: I linked to a page with screens on how to do this in phpMyAdmin, but I've spent more (too much, in fact) time/effort on explaining the various queries one can use to import a file into a table without duplicate issues.
-
nl-x over 10 years: @EliasVanOotegem I focused on the OP's question. And even then, actually only the first two bullet points are a correct answer, as the OP in the end wants to have a list of the duplicates. I think the third bullet point, as well as Karl's suggestion, will not yield a list of 'warnings'. Furthermore, the third bullet point in fact does exactly what Karl suggests, only no file editing is involved anymore.
-
Elias Van Ootegem over 10 years: @nl-x: I'm not saying your answer is incorrect... And I missed the bit saying the OP wanted to see the duplicates after the import, but that's my fault. All I meant to do was clarify Karl's comment: he wanted you to add the INSERT IGNORE to prevent this question from being flooded by "use INSERT IGNORE to do this" answers. That's all. Yours does indeed solve the OP's problem, mine doesn't... I'm not contradicting that
-
Karl over 10 years: Elias is completely right. I simply think it would be a more complete answer if you explained the method of simply adding IGNORE, as this would work as well: you may only want it to affect a certain query, not all databases, so changing the settings wouldn't suffice. Your method certainly works, but you might as well add that method too for a complete answer, and possibly even a positive vote ;)
smehsoud over 7 years: Thanks, nice information, your answer helped. My problem was uploading a CSV file: I made one column unique, so it generated an error and stopped the upload whenever the file had a duplicate value. Thanks.