How to skip duplicate records when importing in phpMyAdmin

Solution 1

In phpMyAdmin, on the Settings tab, you can try checking the following options:

  • Settings -> SQL Queries -> Ignore multiple statement errors

If you are using CSV format:

  • Settings -> Import -> CSV -> Do not abort on INSERT error

If you are using SQL format:

  • Settings -> Export -> SQL -> Use ignore inserts
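
For reference, this is the difference that last option makes in the generated dump (the table and column names below are placeholders): instead of plain INSERT statements, the export contains INSERT IGNORE statements, so rows that collide with an existing primary key are skipped instead of aborting the import.

-- without "Use ignore inserts":
INSERT INTO `yourTbl` (`field1`, `field2`) VALUES (1, 'a');

-- with "Use ignore inserts" checked:
INSERT IGNORE INTO `yourTbl` (`field1`, `field2`) VALUES (1, 'a');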

Solution 2

There are a couple of ways to do what you want:

The brutal way:

TRUNCATE TABLE yourTbl; -- empties out the table

Then import, but you might lose data, so perhaps create a backup table first (a minimal sketch of that follows). All things considered, just don't do this; check the alternatives listed below.
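
If you do insist on this approach, back the table up first. A minimal sketch, assuming the backup table name is free to choose (note that CREATE TABLE ... AS SELECT copies the rows but not indexes or keys):

CREATE TABLE yourTbl_backup AS SELECT * FROM yourTbl; -- copy all rows first
TRUNCATE TABLE yourTbl; -- only then empty the original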

Write your own INSERT query with an IGNORE clause:

INSERT IGNORE INTO yourTbl -- as shown in the linked duplicate
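
To illustrate (the table and values here are made up): a row whose primary key already exists is silently skipped, and MySQL records a warning for it, which also gives you the list of duplicates you asked for:

INSERT IGNORE INTO yourTbl (`field1`, `field2`)
    VALUES (1, 'already exists'), (2, 'new row');
SHOW WARNINGS; -- one "Duplicate entry ... for key 'PRIMARY'" row per skipped record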

But since you are importing a file, the query will most likely be a LOAD DATA [LOCAL] INFILE. As you can see in the manual, you can easily add IGNORE to that query, too:

LOAD DATA LOCAL INFILE '/path/to/files/filename1.csv' IGNORE -- IGNORE goes here
    INTO TABLE your_db.your_tbl
        FIELDS TERMINATED BY ';'
               OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\n'
    (`field1`,`field2`);
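
The same trick works here: rows skipped because of the IGNORE keyword are reported in the statement's summary line (Records: ... Skipped: ... Warnings: ...), and you can list them right after the import:

SHOW COUNT(*) WARNINGS; -- total number of skipped/problem rows
SHOW WARNINGS;          -- the actual "Duplicate entry" messages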

That's it. Don't worry if you're not too comfortable writing your own queries like this; there are other ways of doing what you want:
The CLI way:

mysqlimport -i dbname fileToImport
# Or
mysqlimport --ignore dbname fileToImport
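
mysqlimport derives the table name from the file name (so your_tbl.csv is loaded into your_tbl), and it accepts the same field/line options as LOAD DATA. A fuller invocation matching the CSV example above might look like this (paths and names are placeholders):

mysqlimport --ignore --local -u root -p \
    --fields-terminated-by=';' \
    --fields-optionally-enclosed-by='"' \
    --lines-terminated-by='\n' \
    your_db /path/to/files/your_tbl.csv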

Also on the CLI, create a file containing the LOAD DATA query above, then:

$: mysql -u root -p
*********** #enter password
mysql> source /path/to/queryFile.sql

This requires you to have access to the command line and to run these commands. Here's the manual page of MySQL.

Using phpMyAdmin, when importing, you'll find a checkbox saying "Ignore duplicates"; check that and import. Here's a page with screenshots.
You could also choose to check "Ignore errors", but that's another brute-force approach, and I wouldn't recommend that.

Author: Jo E.

Updated on August 29, 2020

Comments

  • Jo E.
    Jo E. over 3 years

    I have a DB on my local machine and I want to import the data into the DB on my hosting. Both DBs are identical: the same table names, column names, etc.

    When I export the table from my local DB through phpMyAdmin and import it through phpMyAdmin on my hosting, an error pops up telling me that there are duplicate entries for the primary key, and it stops the whole operation.

    How can I import the data through phpMyAdmin, skip the duplicate entries, and display a list of the duplicates at the end of the process?

    A solution I could implement is to fetch all the primary key values from the DB at my hosting and filter out the duplicates before import. But I am wondering: is there a quick solution for this with phpMyAdmin?

    • Karl
      Karl over 10 years
    • Jo E.
      Jo E. over 10 years
      @Karl I saw that post. But I don't see an area in the phpMyAdmin import tab where I can type in SQL code?
    • Elias Van Ootegem
      Elias Van Ootegem over 10 years
      So all you want to do is IGNORE duplicates on INSERT (combine the two, and you get INSERT IGNORE)
    • Karl
      Karl over 10 years
      @joespina you'd need to edit the file you're trying to import.
  • Karl
    Karl over 10 years
    nl-x, please reference the "INSERT IGNORE" method in your answer too just in case someone truly is searching for it. Just to stop anyone else submitting another answer.
  • nl-x
    nl-x over 10 years
    @Karl: I don't understand what you mean. The phpMyAdmin Export "Use ignore inserts" option is the third bullet point in my answer.
  • Elias Van Ootegem
    Elias Van Ootegem over 10 years
    @nl-x: I think Karl meant that, for your answer to be more complete (and relevant for future reference), you might want to consider adding the actual queries that one can use to do this. Check my answer: I linked to a page with screenshots of how to do this in phpMyAdmin, but I've spent more (too much, in fact) time/effort on explaining the various queries one can use to import a file into a table without duplicate issues.
  • nl-x
    nl-x over 10 years
    @EliasVanOotegem I focused on the OP's question. And even then, actually only the first two bullet points are a correct answer, as the OP in the end wants to have a list of the duplicates. I think the third bullet point, as well as Karl's suggestion, will not yield a list of 'warnings'. Furthermore, the third bullet point in fact does exactly what Karl suggests, only no file editing is involved anymore.
  • Elias Van Ootegem
    Elias Van Ootegem over 10 years
    @nl-x: I'm not saying your answer is incorrect... And I missed the bit saying the OP wanted to see the duplicates after the import, but that's my fault. All I meant to do was clarify Karl's comment: He wanted you to add the INSERT IGNORE to prevent this question from being flooded by "use INSERT IGNORE to do this" answers. That's all. Yours does indeed solve the OP's problem, mine doesn't... I'm not contradicting that
  • Karl
    Karl over 10 years
    Elias is completely right. I simply think it would be a more complete answer if you explained the method of simply adding IGNORE, as this would work as well: you may only want it to affect a certain query, not all databases, so changing the settings wouldn't suffice. Your method certainly works, but you might as well add that method too for a complete answer, and possibly even a positive vote ;)
  • smehsoud
    smehsoud over 7 years
    Thanks, nice information. Your answer helped. My problem was uploading a CSV file where I had made one column unique, so it generated an error and stopped the upload whenever the file had a duplicate value. Thanks!