Import sql file in node.js and execute against PostgreSQL


Solution 1

I've written the following function which works for my case. It would have been much simpler if it weren't for:

  • Using batch to manage concurrency
  • Having the tricky PostgreSQL COPY case to consider

Code snippet:

var fs = require('fs');

function processSQLFile(fileName) {

  // Extract SQL queries from the file. Assumes ';' appears only as a statement terminator
  var queries = fs.readFileSync(fileName).toString()
    .replace(/(\r\n|\n|\r)/gm, " ") // remove newlines
    .replace(/\s+/g, ' ') // collapse excess whitespace
    .split(";") // split into individual statements
    .map(Function.prototype.call, String.prototype.trim)
    .filter(function(el) { return el.length != 0; }); // remove any empty ones

  // Execute each SQL query sequentially
  queries.forEach(function(query) {
    batch.push(function(done) {
      if (query.indexOf("COPY") === 0) { // COPY - needs special treatment
        var regexp = /COPY\ (.*)\ FROM\ (.*)\ DELIMITERS/gmi;
        var matches = regexp.exec(query);
        var table = matches[1];
        var fileName = matches[2];
        var copyString = "COPY " + table + " FROM STDIN DELIMITERS ',' CSV HEADER";
        var stream = client.copyFrom(copyString);
        stream.on('close', function () {
          done();
        });
        var csvFile = __dirname + '/' + fileName;
        var str = fs.readFileSync(csvFile);
        stream.write(str);
        stream.end();
      } else { // Other queries don't need special treatment
        client.query(query, function(err, result) { // node-postgres callbacks are (err, result)
          done(err);
        });
      }
    });
  });
}
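Here client is assumed to be a connected node-postgres client, and batch a task queue that runs the pushed functions in order (the answer never shows it; one of the comments below asks the same thing). As a rough sketch of the idea, a minimal sequential stand-in for batch could look like this (illustrative only, not any particular module's API):

// Minimal sequential task queue (illustrative stand-in for `batch`):
// run() executes each pushed task in order, waiting for its done() callback
var batch = {
  tasks: [],
  push: function(task) { this.tasks.push(task); },
  run: function(callback) {
    var tasks = this.tasks;
    (function next(i) {
      if (i === tasks.length) return callback();
      tasks[i](function(err) {
        if (err) return callback(err); // stop at the first failing statement
        next(i + 1);
      });
    })(0);
  }
};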

Beware that this would fail if you used semicolons anywhere except to terminate SQL statements.
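Note also that client.copyFrom comes from an old version of node-postgres; in current versions, COPY FROM STDIN support lives in the separate pg-copy-streams package. The COPY branch above would look roughly like this today (table and file names are placeholders):

var fs = require('fs');
var copyFrom = require('pg-copy-streams').from;

// client.query() returns a writable stream when given a pg-copy-streams query
var stream = client.query(copyFrom("COPY my_table FROM STDIN WITH (FORMAT csv, HEADER)"));
fs.createReadStream(__dirname + '/my_data.csv')
  .pipe(stream)
  .on('finish', function () { done(); })
  .on('error', function (err) { done(err); });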

Solution 2

You can just separate consecutive queries with semicolons in the string passed to client.query

That works:

var pg = require('pg');

pg.connect('postgres://test:test@localhost/test', function(err, client, done){
    client.query('CREATE TABLE test (test VARCHAR(255)); INSERT INTO test VALUES(\'test\') ');
    done();
});

And consequently, that works too:

var pg = require('pg');
var fs = require('fs');

var sql = fs.readFileSync('init_database.sql').toString();

pg.connect('postgres://test:test@localhost/test', function(err, client, done){
    if(err){
        console.log('error: ', err);
        process.exit(1);
    }
    client.query(sql, function(err, result){
        done();
        if(err){
            console.log('error: ', err);
            process.exit(1);
        }
        process.exit(0);
    });
});
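Note that pg.connect was removed in node-postgres 7; with current versions of pg, the same approach would look roughly like this (same connection string and file name as above):

var fs = require('fs');
var Pool = require('pg').Pool;

var pool = new Pool({ connectionString: 'postgres://test:test@localhost/test' });
var sql = fs.readFileSync('init_database.sql').toString();

// With no bind parameters, pg uses the simple query protocol,
// so a multi-statement string is executed in a single call
pool.query(sql)
  .then(function () { console.log('done'); })
  .catch(function (err) { console.error('error: ', err); })
  .then(function () { return pool.end(); });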

Solution 3

The @databases/pg client supports running SQL files out of the box:

const createPool = require('@databases/pg');
const {sql} = require('@databases/pg');

const db = createPool();

db.query(sql.file('my-file.sql')).catch(ex => {
  console.error(ex);
  process.exitCode = 1;
}).then(() => db.dispose());
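Called with no arguments, createPool presumably picks up the connection details from the environment (the usual DATABASE_URL convention); a connection string can also be passed explicitly, e.g. createPool('postgres://test:test@localhost/test'), as one of the comments below notes.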

It also supports having multiple statements in a single call to db.query:

const createPool = require('@databases/pg');
const {sql} = require('@databases/pg');

const db = createPool();

db.query(sql`
  INSERT INTO users (name) VALUES (${'Forbes'});
  SELECT * FROM users;
`).then(
  results => console.log(results)
).catch(ex => {
  console.error(ex);
  process.exitCode = 1;
}).then(() => db.dispose());

In this example, each statement is run in sequence, and the result of the last statement is returned.


Comments

  • rgareth over 1 year

    I'm looking for an efficient way to take a raw SQL file and have it executed synchronously against a Postgres database, akin to running it through psql.

    I have an SQL file which creates all databases, imports data, etc. I need to execute this using Node.js but cannot find any module which does this automatically. For the Node.js application itself, we use node-postgres ('pg'), knex.js and bookshelf.js. I assume, though, that pg is best for this.

    One alternative I can think of is to read the full file, split it by semicolons, replace newlines with spaces, trim any duplicate spaces, then feed the statements into pg one by one so that they're executed sequentially, not asynchronously. I'd be a little surprised if this were truly the most efficient way, and if no libraries existed yet to solve it. I'm a little hesitant to jump into it, seeing as SQL syntax can itself be a little challenging and I might accidentally mash it up.

    Some clarifications in advance:

    • psql cannot be used as it's not installed on the target machine
    • I've chosen to develop and source control sql statements in sql native form, because it's a lot easier for a DBA to use and manipulate it
  • rgareth about 10 years
    I tried this initially but found problems: (1) newlines and tabs traditionally found in SQL files could not be stuffed into client.query (resolvable with some extra steps); (2) per-query error handling is very difficult (which I have admittedly also left out of my example here).
  • OhJeez about 10 years
    @rgareth I had no problems whatsoever with newlines and tabs in SQL files. You can add some \ns (even \r\ns) and \ts to my example and it will still work.
  • user2670818 over 8 years
    That worked for me with the ibm_db lib. I was facing the same problem, but one question came to mind: what about SQL injection?
  • user7740901 almost 8 years
    Can you add details on what batch is?
  • Alexander Pravdin over 4 years
    What about importing scripts with functions?
  • tukusejssirs over 3 years
    While your answer helped me, you could mention that the @databases packages are created by you. 😃 … Also, the docs (on your website) could be improved with more examples; e.g., I (as a Node.js newbie) had trouble figuring out where to put the connection options, because all your examples use a single string in createConnectionPool(), whereas I can also pass an object with the supported connection options. Keep up the good work!
  • ForbesLindesay about 3 years
    Thanks for the feedback @tukusejssirs. If you have suggestions for improvements to the docs, I'd love it if you could submit a PR. All the files you would need to edit are here: github.com/ForbesLindesay/atdatabases/tree/master/docs
  • Lee Goddard almost 2 years
    You ought to edit your answer to mention you are the author of the packages you are recommending.