MySQL bulk insert

Solution 1

You could write a single INSERT query that inserts several rows in one call to the database:

insert into your_table (field1, field2, field3)
values 
  (value1_1, value1_2, value1_3), 
  (value2_1, value2_2, value2_3), 
  (value3_1, value3_2, value3_3)


With the example above, this single query inserts three rows into the table.
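If the rows live in memory, such a statement can be assembled programmatically. A minimal Java sketch (the helper name and the JDBC-style `?` placeholders are illustrative; the table and column names follow the example above):

```java
import java.util.List;
import java.util.stream.Collectors;

public class MultiRowInsert {
    // Builds one multi-row INSERT statement with '?' placeholders,
    // one "(?, ?, ?)" group per row to be inserted.
    static String buildInsert(String table, List<String> columns, int rowCount) {
        String cols = String.join(", ", columns);
        String oneRow = "(" + columns.stream().map(c -> "?")
                                     .collect(Collectors.joining(", ")) + ")";
        StringBuilder sql = new StringBuilder(
                "INSERT INTO " + table + " (" + cols + ") VALUES ");
        for (int i = 0; i < rowCount; i++) {
            if (i > 0) sql.append(", ");
            sql.append(oneRow);
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        // prints: INSERT INTO your_table (field1, field2, field3) VALUES (?, ?, ?), (?, ?, ?), (?, ?, ?)
        System.out.println(buildInsert("your_table",
                List.of("field1", "field2", "field3"), 3));
    }
}
```

Binding the values through placeholders, instead of concatenating them into the string, avoids quoting and SQL-injection problems.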

Solution 2

MySQL's LOAD DATA command might be useful to you: http://dev.mysql.com/doc/refman/5.5/en/load-data.html

Regarding Pascal's suggestion: unless your command exceeds max_allowed_packet, you should be able to execute the query in one go. In many cases it works best to split the data into several smaller inserts of, say, 1000 rows each.
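Splitting, say, 4K in-memory rows into inserts of at most 1000 rows each only needs a small chunking helper. A minimal sketch (the generic helper is my own, not from the answer):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedInserts {
    // Splits an in-memory list of rows into sublists of at most chunkSize
    // rows; each sublist then becomes one multi-row INSERT statement.
    static <T> List<List<T>> chunk(List<T> rows, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += chunkSize) {
            chunks.add(rows.subList(i, Math.min(i + chunkSize, rows.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 4000; i++) rows.add(i);
        // prints: 4
        System.out.println(chunk(rows, 1000).size());
    }
}
```

Each chunk's INSERT stays well under max_allowed_packet, and a failed chunk can be retried without resending everything.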

Solution 3

You can execute your statements in a batch using JDBC's addBatch()/executeBatch() API.

Also, call setAutoCommit(false), and then conn.commit() after executeBatch(), to minimise the number of commits.
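Putting those pieces together, a minimal JDBC sketch might look like this (the table, column names, and Row type are illustrative, and it assumes an already-open Connection to MySQL):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertExample {
    // Hypothetical in-memory row type; replace with your own data holder.
    record Row(int field1, int field2, String field3) {}

    // Number of executeBatch()/commit() round trips insertAll will perform.
    static int batchesNeeded(int rowCount, int batchSize) {
        return (rowCount + batchSize - 1) / batchSize;
    }

    // Inserts all rows in batches of batchSize, committing once per batch.
    static void insertAll(Connection conn, List<Row> rows, int batchSize)
            throws SQLException {
        String sql = "INSERT INTO your_table (field1, field2, field3) VALUES (?, ?, ?)";
        conn.setAutoCommit(false);                // avoid one commit per row
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (Row r : rows) {
                ps.setInt(1, r.field1());
                ps.setInt(2, r.field2());
                ps.setString(3, r.field3());
                ps.addBatch();
                if (++pending == batchSize) {     // flush a full batch
                    ps.executeBatch();
                    conn.commit();
                    pending = 0;
                }
            }
            if (pending > 0) {                    // flush the remainder
                ps.executeBatch();
                conn.commit();
            }
        }
    }

    public static void main(String[] args) {
        // prints: 4   (commits needed for the questioner's 4K rows at 1000/batch)
        System.out.println(batchesNeeded(4000, 1000));
    }
}
```

With MySQL Connector/J, adding `rewriteBatchedStatements=true` to the JDBC URL makes the driver rewrite such batches into multi-row INSERTs, which typically speeds them up considerably.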

Solution 4

Bulk-insert more than 7,000,000 records into the database in about a minute (a fast query that also performs calculations):

    mysqli_query($cons, '
    LOAD DATA LOCAL INFILE "'.$file.'"
    INTO TABLE tablename
    FIELDS TERMINATED BY \',\'
    LINES TERMINATED BY \'\n\'
    IGNORE 1 LINES
    (isbn10,isbn13,price,discount,free_stock,report,report_date)
     SET RRP = IF(discount = 0.00,price-price * 45/100,IF(discount = 0.01,price,IF(discount != 0.00,price-price * discount/100,@RRP))),
         RRP_nl = RRP * 1.44 + 8,
         RRP_bl = RRP * 1.44 + 8,
         ID = NULL
    ') or die(mysqli_error($cons));
    $affected = (int) mysqli_affected_rows($cons) - 1; // subtract the header line
    $log->lwrite('Inventory.CSV to database: '.$affected.' records inserted successfully.');

RRP, RRP_nl, and RRP_bl are not in the CSV; they are calculated during the load and stored with each inserted row.

Author: Amit

Updated on February 17, 2020

Comments

  • Amit (about 4 years ago)

    I want to insert some 4K rows into the MySQL db. I don't want to fire 4K separate 'insert' queries. Is there any way I can fire a single insert query to store those 4K rows in the db?

    I searched on the internet, and everywhere I found users doing bulk inserts into the db from a file.

    In my case, I have the data in memory, and I don't want to write it to a file first just to do a bulk insert, since that would add delay to the program.

  • Amit (about 13 years ago)
    I think the db puts some constraints on the length of the query. It will work for 3 rows, but I don't think it will work for 4K rows.
  • Nick Bolton (almost 12 years ago)
    This worked for me when I tried inserting 1000 rows (8 x int, 2 x string). But when I tried 2000 rows, I got a "MySQL server has gone away" error. To fix this, I will try to insert 1000 rows at a time like James C suggests.
  • Pacerier (almost 12 years ago)
    @Pascal Is this atomic? Is it possible that if the db crashes halfway, only two out of three rows are inserted?
  • Sebastian Mach (over 10 years ago)
    @YatendraGoel: You can still partition your 4K single-row inserts into n multi-row inserts of 4K/n rows each.
