How to convert CSV to JSON in Node.js

Solution 1

The Node.js csvtojson module is a comprehensive CSV parser. It can be used as a Node.js library, as a command-line tool, or in the browser with the help of browserify or webpack.

The source code can be found at: https://github.com/Keyang/node-csvtojson

It is fast, with low memory consumption, yet powerful enough to support most parsing needs through a rich API and easy-to-read documentation.

The detailed documentation can be found on the GitHub page linked above.

Here are some code examples:

Use it as a library in your Node.js application (csvtojson@2.0.0+):

  1. Install it through npm

npm install --save csvtojson@latest

  2. Use it in your Node.js app:
// require csvtojson
var csv = require("csvtojson");

// Convert a csv file with csvtojson
csv()
  .fromFile(csvFilePath)
  .then(function(jsonArrayObj){ // when parsing is finished, the result will be emitted here
     console.log(jsonArrayObj); 
   })

// Parse a large csv with stream / pipe (low memory consumption)
csv()
  .fromStream(readableStream)
  .subscribe(function(jsonObj){ // a single json object will be emitted for each csv line
     // process each json object asynchronously
     return new Promise(function(resolve,reject){
         asyncStoreToDb(jsonObj,function(){resolve()})
     })
  }) 

//Use async / await
const jsonArray=await csv().fromFile(filePath);
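
If the CSV is already in memory rather than in a file, csvtojson also provides a fromString method. Here is a minimal sketch using the sample data from the question (csvString is a stand-in for your own input):

// Convert an in-memory csv string with csvtojson
var csv = require("csvtojson");

var csvString = "a,b,c,d\n1,2,3,4\n5,6,7,8";

csv()
  .fromString(csvString)
  .then(function(jsonArrayObj){
     // one plain object per csv line, e.g. [{ a: '1', b: '2', c: '3', d: '4' }, ... ]
     console.log(jsonArrayObj);
   })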

Use it as a command-line tool:

sh# npm install csvtojson
sh# ./node_modules/csvtojson/bin/csvtojson ./youCsvFile.csv

-or-

sh# npm install -g csvtojson
sh# csvtojson ./yourCsvFile.csv

For advanced usage:

sh# csvtojson --help

You can find more details from the github page above.

Solution 2

You can try to use underscore.js.

First split the header line and a data line into arrays:

var letters = 'a,b,c,d'.split(',');
var numbers = '1,2,3,4'.split(',');

Then zip the two arrays together using the object function:

var json = _.object(letters, numbers);

By then, the json variable should contain something like:

{"a": "1", "b": "2", "c": "3", "d": "4"}

Solution 3

Had to do something similar, hope this helps.

// Node packages for file system
var fs = require('fs');
var path = require('path');


var filePath = path.join(__dirname, 'PATH_TO_CSV');
// Read CSV (readFileSync is synchronous and does not take a callback)
var f = fs.readFileSync(filePath, {encoding: 'utf-8'});

// Split on row
f = f.split("\n");

// Get first row for column headers
var headers = f.shift().split(",");

var json = [];    
f.forEach(function(d){
    // Loop through each row
    var tmp = {};
    var row = d.split(",");
    for(var i = 0; i < headers.length; i++){
        tmp[headers[i]] = row[i];
    }
    // Add object to list
    json.push(tmp);
});

var outPath = path.join(__dirname, 'PATH_TO_JSON');
// Convert the object to a string and write the JSON to file
// (writeFileSync is synchronous and does not take a callback)
fs.writeFileSync(outPath, JSON.stringify(json), 'utf8');
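
If you prefer a non-blocking variant of the same idea, here is a minimal sketch using the promise-based fs API available in modern Node versions (the convertCsvToJson helper and the paths are placeholders, just like above):

const fs = require('fs').promises;
const path = require('path');

async function convertCsvToJson(csvPath, jsonPath) {
    const text = await fs.readFile(csvPath, 'utf8');
    const rows = text.trim().split('\n');
    // First row holds the column headers
    const headers = rows.shift().split(',');

    const json = rows.map(function (row) {
        const values = row.split(',');
        const obj = {};
        headers.forEach(function (h, i) { obj[h] = values[i]; });
        return obj;
    });

    await fs.writeFile(jsonPath, JSON.stringify(json), 'utf8');
}

convertCsvToJson(path.join(__dirname, 'PATH_TO_CSV'), path.join(__dirname, 'PATH_TO_JSON'))
    .catch(console.error);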

Solution 4

Here is a solution that does not require a separate module. However, it is very crude, and does not implement much error handling. It could also use more tests, but it will get you going. If you are parsing very large files, you may want to seek an alternative. Also, see this solution from Ben Nadel.

Node Module Code, csv2json.js:

/*
 * Convert a CSV String to JSON
 */
exports.convert = function(csvString) {
    var json = [];
    var csvArray = csvString.split("\n");

    // Remove the column names from csvArray into csvColumns.
    // Also replace single quote with double quote (JSON needs double).
    var csvColumns = JSON
            .parse("[" + csvArray.shift().replace(/'/g, '"') + "]");

    csvArray.forEach(function(csvRowString) {

        var csvRow = csvRowString.split(",");

        // Here we work on a single row.
        // Create an object with all of the csvColumns as keys.
        var jsonRow = {};
        for ( var colNum = 0; colNum < csvRow.length; colNum++) {
            // Remove beginning and ending quotes since stringify will add them.
            var colData = csvRow[colNum].replace(/^['"]|['"]$/g, "");
            jsonRow[csvColumns[colNum]] = colData;
        }
        json.push(jsonRow);
    });

    return JSON.stringify(json);
};

Jasmine Test, csv2jsonSpec.js:

var csv2json = require('./csv2json.js');

var CSV_STRING = "'col1','col2','col3'\n'1','2','3'\n'4','5','6'";
var JSON_STRING = '[{"col1":"1","col2":"2","col3":"3"},{"col1":"4","col2":"5","col3":"6"}]';

/* jasmine specs for csv2json */
describe('csv2json', function() {

    it('should convert a csv string to a json string.', function() {
        expect(csv2json.convert(CSV_STRING)).toEqual(
                JSON_STRING);
    });
});
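
A hypothetical usage example tying the module to the question's goal of producing a JSON file (input.csv and output.json are placeholder names, and the header row of the input is expected to use quoted column names, as in the test string above):

// Hypothetical usage from another script in the same folder
var fs = require('fs');
var csv2json = require('./csv2json.js');

var csvString = fs.readFileSync('input.csv', 'utf8');
fs.writeFileSync('output.json', csv2json.convert(csvString), 'utf8');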

Solution 5

Using ES6

const toJSON = csv => {
    const lines = csv.split('\n')
    const headers = lines[0].split(',')

    // Skip the header row, then turn each data line into an object
    const result = lines.slice(1).map(l => {
        const obj = {}
        const line = l.split(',')

        headers.forEach((h, i) => {
            obj[h] = line[i]
        })

        return obj
    })

    return JSON.stringify(result)
}

const csv = `name,email,age
francis,[email protected],33
matty,[email protected],29`

const data = toJSON(csv)

console.log(data)

Output

// [{"name":"name","email":"email","age":"age"},{"name":"francis","email":"[email protected]","age":"33"},{"name":"matty","email":"[email protected]","age":"29"}]
Updated on December 11, 2021

Comments

  • Jetson John
    Jetson John over 2 years

    I am trying to convert a CSV file to JSON.

    Example CSV:

    a,b,c,d
    1,2,3,4
    5,6,7,8
    ...
    

    Desired JSON:

    {"a": 1,"b": 2,"c": 3,"d": 4},
    {"a": 5,"b": 6,"c": 7,"d": 8},
    ...
    

    I tried the node-csv parser library, but the output is an array, not what I expected.

    I'm using Node 0.8 and express.js and would like a recommendation on how to easily accomplish this.

  • Keyang
    Keyang almost 11 years
    Code has been added. See more detailed documentation here github.com/Keyang/node-csvtojson
  • Keyang
    Keyang about 10 years
    Since version 0.3.0, csvtojson does not depend on any other lib. It will behave like a proper Stream object.
  • E. Maggini
    E. Maggini over 9 years
    The link to the blog is dead.
  • Keyang
    Keyang over 9 years
    Updated. Thanks for letting me know.
  • limoragni
    limoragni almost 9 years
    I don't know if this is happening only to me, but for a large CSV file this is too slow, like 10 seconds slower than d3.
  • Spencer
    Spencer over 8 years
    async and underscore was too much for you?
  • xverges
    xverges over 8 years
    @Spencer, at the time that I posted, the dependencies were different: github.com/Keyang/node-csvtojson/blob/… Pulling in express for a csv conversion felt unnatural
  • Spencer
    Spencer over 8 years
    Oh, yeah that is a crazy dependency. My bad.
  • MrWiLofDoom
    MrWiLofDoom over 7 years
    This one didn't work as well as csvtojson. When I had "Aug 23, 2016", it split "Aug 23" and "2016" into different fields.
  • Jess
    Jess over 7 years
    Bummer. You could wrap the date in quotes to fix it?
  • arve0
    arve0 almost 7 years
    Or even var json = f.map(function(d, i){ ... return tmp; })
  • Max
    Max about 5 years
    The link to this library is dead - perhaps it was moved to somewhere else on Github (or forked?). Please update link.
  • Abdennour TOUMI
    Abdennour TOUMI almost 5 years
    Thank you @RohitParte. This is one of my first modules in Node.js. While some features work fine, it is missing a lot of features. I became extremely busy with other things (reliability engineering, DevOps, and so on).
  • Vikas Putcha
    Vikas Putcha over 4 years
    For people like me who are looking for a simpler version that reads from a web URL: csv().fromStream(request.get('example.com/test.csv')).then((jsonArray) => { console.log(util.inspect(jsonArray)); }, (err) => { console.log(err); });
  • CharlesA
    CharlesA over 4 years
    A tweak: I added .map(function(str) { return _.trim(str, '"'); }) to remove any double quotes from the headers and data items, i.e. const header = content[0].split(',').map(function(str) { return _.trim(str, '"'); }); and return _.zipObject(header, row.split(',').map(function(str) { return _.trim(str, '"'); }));
  • Himanshu Tariyal
    Himanshu Tariyal almost 3 years
    What if the file is uploaded by the user? I have the file in the 'file' variable and I want to convert it to JSON. How would I do that with the csvtojson package?
  • Tobiah Rex
    Tobiah Rex over 2 years
    Unfortunately, splitting on newline won't work if any cell has a newline character within it 😅
  • Lakshman Pilaka
    Lakshman Pilaka about 2 years
    solves 90% of the requirements
  • Green
    Green almost 2 years
    @HimanshuTariyal add this line const csvFilePath = <Your file path here> underneath var csv = require("csvtojson");
  • Sayf-Eddine
    Sayf-Eddine almost 2 years
    There is a memory leak in your code.