Importing JSON into MySQL


Solution 1

You can export your JSON to CSV: http://www.danmandle.com/blog/json-to-csv-conversion-utility/ or https://github.com/danmandle/JSON2CSV

Then:

LOAD DATA INFILE 'filepath/your_csv_file.csv' INTO TABLE tablename;

It should be fine for a one-shot import.

More info: see the MySQL documentation for LOAD DATA INFILE.
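The linked utilities do essentially this flattening step. As a rough sketch of the same idea (in Python; the function name and the assumption that the JSON is an array of flat objects are mine, not the utility's):

```python
import csv
import json

def json_to_csv(json_path, csv_path):
    """Flatten a JSON array of flat objects into a CSV file."""
    with open(json_path) as f:
        rows = json.load(f)  # expects a list of dicts with scalar values
    # Collect the union of all keys so rows with missing fields still align
    fieldnames = []
    for row in rows:
        for key in row:
            if key not in fieldnames:
                fieldnames.append(key)
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, restval="")
        writer.writeheader()
        writer.writerows(rows)
```

The resulting CSV can then be fed to LOAD DATA INFILE as above; note that nested objects (like the `from`/`to` fields in the Facebook feed) would need to be flattened or dropped first.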

Solution 2

Like others have said, you are going to have to do a little bit of conversion to do what you're asking. Fortunately, it's not too difficult to have PHP iterate through the data and do most of the work for you. Here's a quick-and-dirty attempt:

// MySQL table's name
$tableName = 'my_table';
// Get JSON file and decode contents into PHP arrays/values
$jsonFile = '/path/to/file.json';
$jsonData = json_decode(file_get_contents($jsonFile), true);

// Iterate through JSON and build INSERT statements
foreach ($jsonData as $id=>$row) {
    $insertPairs = array();
    foreach ($row as $key=>$val) {
        $insertPairs[addslashes($key)] = addslashes($val);
    }
    $insertKeys = '`' . implode('`,`', array_keys($insertPairs)) . '`';
    $insertVals = '"' . implode('","', array_values($insertPairs)) . '"';

    echo "INSERT INTO `{$tableName}` ({$insertKeys}) VALUES ({$insertVals});" . "\n";
}

This will spit out a series of INSERT INTO my_table (...) VALUES (...); SQL statements that you can save to a .sql file and import into MySQL. With this method each INSERT can have different columns without causing problems.

Note: addslashes() isn't a very secure way of doing this; something like mysqli_real_escape_string() would be preferable, but this works without a DB connection, assuming your JSON data is trustworthy.
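If you can open a DB connection, parameterized queries sidestep the escaping question entirely. Here is a minimal sketch of the same per-row insert using placeholders; sqlite3 is used only as a stand-in for a MySQL driver (which would use %s placeholders), and whitelisting column names against the table's schema is my own addition, since identifiers cannot be passed as parameters:

```python
import json
import sqlite3

def insert_json_rows(conn, table, json_text):
    """Insert each object in a JSON array using parameterized queries,
    avoiding manual escaping of values entirely. Column names are
    whitelisted against the table's actual schema rather than escaped."""
    rows = json.loads(json_text)
    # PRAGMA table_info yields one row per column; index 1 is the name
    known = {info[1] for info in conn.execute(f"PRAGMA table_info({table})")}
    for row in rows:
        cols = [c for c in row if c in known]  # drop unknown keys
        placeholders = ", ".join("?" for _ in cols)
        sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
        conn.execute(sql, [row[c] for c in cols])
```

As in the PHP version, each row may carry a different subset of columns without causing problems; unlike it, values never pass through string concatenation.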

Solution 3

Using MySQL Shell 8.0.13 you can import JSON documents straight into MySQL Server using the util.importJson shell API function or the --import command-line option. For example, to import from a file:

mysqlsh user@host:port/database --import /tmp/facebook.json collection_name

or pipe a JSON document to stdin:

cat /tmp/facebook.json | mysqlsh user@host:port/database --import - collection_name

where collection_name is the target collection.

More about MySQL Shell's JSON Import utility: https://dev.mysql.com/doc/mysql-shell/8.0/en/mysql-shell-utilities-json.html

More about document store: https://dev.mysql.com/doc/refman/8.0/en/document-store.html

More about documents and collections: https://dev.mysql.com/doc/refman/8.0/en/mysql-shell-tutorial-javascript-documents-collections.html

Solution 4

The following was tested and works on MySQL 8.0.18. I expect it to work on any version that supports the JSON data type and the LOAD DATA statement.

The JSON file can be loaded using a LOAD DATA statement from any SQL client, assuming the user has the proper permission to run the statement and proper access to the file.

create table jsonTableName( jsonColumnName json );

load data infile '/var/lib/mysql-files/someFile.json'
  into table jsonTableName
  fields terminated by '\0' escaped by ''
  lines terminated by '\0';

Terminating both fields and lines by '\0' disables field and line parsing, so the entire file is treated as a single cell.

escaped by '' stops the loader from interpreting \, allowing \" to be sent directly to the JSON parser. Without it, any escape sequence would be loaded improperly, potentially causing an exception.

Please use LOAD DATA LOCAL INFILE (as @Vijay suggested) when the JSON file is located on your local machine and the MySQL server is not (i.e. for remote transfers). Keep in mind that LOCAL works only if both the server and your client have been configured to permit it. Use select @@local_infile; to check the server setting (0 means not allowed, 1 means allowed). For the MySQL Workbench client, add OPT_LOCAL_INFILE=1 under Connection -> Advanced -> Others. For the IntelliJ/DataGrip client, set allowLoadLocalInfile to true in the advanced connection settings.
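When LOAD DATA is not an option at all (no FILE privilege, LOCAL disabled on either side), the same single-value load can be done client-side: read the whole file and insert it with a parameterized query. A minimal sketch, again with sqlite3 standing in for a MySQL connector (which would use a %s placeholder); the table and column names mirror the example above:

```python
import json

def load_json_file_as_single_value(conn, table, column, path):
    """Read the whole file and insert it as one value via a
    parameterized query -- the client-side equivalent of the
    LOAD DATA ... TERMINATED BY '\\0' trick, with no escaping
    concerns and no FILE privilege or local_infile requirement."""
    with open(path) as f:
        doc = f.read()
    json.loads(doc)  # fail early if the file is not valid JSON
    conn.execute(f"INSERT INTO {table} ({column}) VALUES (?)", (doc,))
```

Because the document travels as a bound parameter, escape sequences like \" inside the JSON arrive at the server untouched.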

Author: John

Updated on June 03, 2021

Comments

  • John
    John almost 3 years

The problem that I am facing is importing JSON into MySQL. I have searched the interwebs, I have searched Stack Overflow and probably even more sources, but could not find a single proper solution. Full disclosure here: I am not a PHP, SQL, or JSON professional.

What I am trying to accomplish is fairly simple: import this JSON file https://developers.facebook.com/tools/explorer/method=GET&path=245226895508982%2Ffeed%3Faccess_token%3D%3Caccess_token%3E= into a MySQL database. I understand that I need to somehow allocate the JSON data into columns and then populate the rows accordingly with the content. I do not need all of the data, but most of it. The data should later be echoed back through a search query from a user.

My question seems fairly simple: how can I do it?

P.S.: I tried converting it into XML, but I do not have sufficient rights to import the XML into the SQL database. Also, the converter did not do the best job.

  • John
    John almost 11 years
Thanks! I tried that one too. This is what is happening: `Warning: array_keys(): The first argument should be an array in /home/mandledp/danmandle.com/projects/JSON2CSV/json2csv.class.php on line 20` ... `Warning: Invalid argument supplied for foreach() in /home/mandledp/danmandle.com/projects/JSON2CSV/json2csv.class.php on line 20` ... I found json-csv.com/Default.aspx to be an excellent solution. However, they charge 4 USD a copy, which is obviously not feasible.
  • kmas
    kmas almost 11 years
    Show us the first lines of your JSON file.
  • John
    John almost 11 years
    { "id": "245226895508982_585115708186764", "from": { "name": "Ma Mei", "id": "100005547752625" }, "to": { "data": [ { "name": "Wohnung/WG in München gesucht!", "id": "245226895508982" } ] }, "message": "Hallo,\n\nsuchen eine 3 oder 4 Zimmer Wohnung", "privacy": { "value": "" }, "type": "status", "created_time": "2013-06-26T21:44:27+0000", "updated_time": "2013-06-26T21:44:27+0000" }
  • kmas
    kmas almost 11 years
Ok, so I think that with this JSON file, it won't be easy to insert the data into your database. It is not a flat array of cells {key1: value1, key2: value2}. You'll have to extract, transform, and then load your data ==> look up ETL if it can help you (Pentaho is a free ETL tool).
  • isgoed
    isgoed over 5 years
JSON2CSV failed for me with the error "JSON is either invalid or has nested elements.", but it is just a plain flat JSON object with key-values. I found out that Excel (for Office 365) can convert JSON to CSV as well. I ran the command as suggested, but found out the file has to be in the directory given by SHOW VARIABLES LIKE "secure_file_priv";. Then lastly, the file should not have been comma-separated but tab-separated (some users reported even semicolon-separated).
  • Ly Thanh Ngo
    Ly Thanh Ngo over 4 years
Add this to your code; it will insert NULL for empty values: $insertVals = str_replace("''", 'NULL', $insertVals);
  • Vijay
    Vijay almost 3 years
This works and helps to store the data in a column of JSON data type... The command has to start with LOAD DATA LOCAL INFILE.
  • user9526573
    user9526573 almost 3 years
You are correct, @Vijay. While testing, the JSON file was on the same machine as the MySQL server, hence LOCAL was not required. Moreover, LOCAL is a rather messy option with a number of potential pitfalls, which made me want to steer clear of it. Still, it is a very useful option, and I added a paragraph with a bit more info about it.
  • The Onin
    The Onin about 2 years
    Too bad X Dev protocol doesn't work on RDS.
  • kgr
    kgr about 2 years
    If your input file has one json document per line, you can try to use util.importTable with dialect option set to json.