Exporting Large MySQL Table


Solution 1

Honestly, for this size of data I would suggest doing a mysqldump, then importing the table into another copy of MySQL installed somewhere else, maybe on a virtual machine dedicated to the task. From there you can set timeouts and such as high as you want without worrying about resource limits blowing up your production database. Using nice on Unix-based OSes, or process priorities on Windows-based systems, you should be able to do this without much impact on the production system.
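A rough sketch of that dump-and-restore approach; the database names, table name, and user names below are placeholders, not anything from the question:

```shell
# Dump the table at low CPU priority so production is barely affected.
# --single-transaction avoids locking InnoDB tables; --quick streams rows
# instead of buffering the whole result in memory.
nice -n 19 mysqldump --single-transaction --quick \
  -u backup_user -p production_db big_table > big_table.sql

# Load the dump into a throwaway MySQL instance (e.g. on a VM),
# where you can crank timeouts up as high as you like:
mysql -u scratch_user -p scratch_db < big_table.sql
```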

Alternatively, you can set up a replica of your production database and pull the data from there. Having a so-called "reporting database" that replicates various tables or even entire databases from your production system is actually a fairly common practice in large environments to ensure that you don't accidentally kill your production database pulling numbers for someone. As an added advantage, you don't have to wait for a mysqldump backup to complete before you start pulling data for your boss; you can do it right away.

Solution 2

Is your boss Rain Man? Does he just spot 'information' in raw 'data'?

Or does he build functions in Excel rather than ask for what he really needs?

Sit with him for an hour and see what he's actually doing with the data. What questions are being asked? What patterns are being detected (manually)? Write a real tool to find that information and then plug it into your monitoring/alerting system.

Or, get a new boss. Seriously. You can tell him I said so.

Solution 3

960,000 rows is not that many. Run a query to export your data into a CSV file:

SELECT column1, column2 INTO OUTFILE '/path/to/file/result.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM yourtable WHERE columnx = 'condition';
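Note that INTO OUTFILE writes the file on the database server's host and requires the FILE privilege, and the server's secure_file_priv setting may restrict which directory it can write to. If you need the file on your own machine instead, the mysql command-line client can produce delimited output directly; a sketch with placeholder names:

```shell
# Client-side alternative to INTO OUTFILE: --batch prints the result set
# as tab-separated rows to stdout, which we redirect into a local file.
mysql --batch -u user -p yourdb \
  -e "SELECT column1, column2 FROM yourtable WHERE columnx = 'condition'" \
  > result.tsv
```

Tab-separated files open in Excel just as easily as comma-separated ones.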
Author: Chris Nelson

Updated on June 09, 2022

Comments

  • Chris Nelson, almost 2 years ago

    I have a table in MySQL that I manage using phpMyAdmin. Currently it's sitting at around 960,000 rows.

    I have a boss who likes to look at the data in Excel, which means that every week I have to export the data into Excel.

    I am looking for a more efficient way to do this, since I can't export the entire table at once because it times out. So I have been stuck 'chunking' the table into smaller queries and exporting it that way.

    I have tried connecting Excel (and Access) directly to my database, but I hit the same problem: it times out. Is there any way to extend the connection limit?
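For reference, the 'chunking' workaround described above can be scripted rather than done by hand in phpMyAdmin. A dry-run sketch that prints one export command per chunk; the database, table, and id column are placeholders, and an indexed numeric id column is assumed for stable ordering (remove the echo to actually run the exports):

```shell
#!/bin/sh
# Print one mysql invocation per 100,000-row chunk of a ~960,000-row table.
CHUNK=100000
TOTAL=960000     # approximate row count from the question
OFFSET=0
while [ "$OFFSET" -lt "$TOTAL" ]; do
  echo "mysql --batch yourdb -e \"SELECT * FROM yourtable ORDER BY id LIMIT $CHUNK OFFSET $OFFSET\" > chunk_$OFFSET.tsv"
  OFFSET=$((OFFSET + CHUNK))
done
```

Each chunk is a small, fast query, so no single request hits the timeout. Note that large OFFSET values get slow on big tables; keyset pagination (WHERE id > last_seen_id) scales better if the id column is indexed.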