PostgreSQL Export CSV with Node

Introduction

Node.js is known for building scalable applications, and integrating a PostgreSQL database gives you a powerful combination for meeting your development needs. Even so, you'll also need the ability to export data in different file formats, and CSV is one of the most commonly used. This tutorial explains how to export a PostgreSQL table to a CSV file with Node.

If you’re already familiar with exporting CSV data in Node and just want to see the code, skip ahead to Just the Code.

Prerequisites for using PostgreSQL and Node.js

  • Verify that you’ve installed PostgreSQL.

  • On Ubuntu and other Linux distributions, the service postgresql status command will confirm an active status. Enter service postgresql status at the command line, and after you see the status indication, exit by pressing CTRL+C.

  • Confirm that the psql interactive command-line client is installed by entering psql -V. Alternatively, input ps aux | grep postgres for macOS. If you have Linux, that command will work for it as well. The display will show all PostgreSQL processes that are active.

  • If you don’t have Node.js installed, do so now.


Check that Node and its ‘npm’ package manager are installed

  • At the command line, type node -v to identify the version of Node.js on your OS.

  • Next, type npm -v to confirm that the Node Package Manager is installed and running.
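For example, each command should return a version number like the following (your versions will almost certainly differ):

node -v
v12.18.2
npm -v
6.14.5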

Use the Node Package Manager to install the ‘pg’ Postgres packages

  • Add the client pg to your Node.js dependencies.
npm i pg

Next, you’ll need a CSV module to export the PostgreSQL table data via Node.js. Popular library options include json2csv, fast-csv, and csv-writer; this tutorial uses fast-csv.
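If you’d like to experiment with the other libraries as well, you can install all three in one command (only fast-csv is used in this tutorial):

npm i json2csv fast-csv csv-writer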

Install the Fast CSV Node library for accessing CSV data in Node.js

Parsing data is the role of the fast-csv module. It does this by delimiting data values so the format becomes CSV-compliant.

  • At a terminal window, install fast-csv with the npm install command or its shorthand npm i.
npm i fast-csv
  • After a successful installation, you should see a response like this one:
+ fast-csv@3.4.0
added 4 packages from 50 contributors and audited 245 packages in 5.098s
found 0 vulnerabilities

Insert some records into a PostgreSQL table

Create a sample database with records in a table so you can test out the examples given in this tutorial.

  • To add records to a PostgreSQL table named sample_table, use psql:

NOTE: If you’ve created a sample database and table with records in it, feel free to skip adding more records to your table.
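If you haven’t set up the sample database and table yet, a minimal setup might look like this (the database name matches the connection settings used later in this tutorial; the column types are assumptions based on the sample data):

CREATE DATABASE sample_db;
\c sample_db
CREATE TABLE sample_table (
  id INTEGER PRIMARY KEY,
  firstname TEXT,
  lastname TEXT,
  age INTEGER,
  email TEXT
);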

INSERT INTO sample_table(id, firstname, lastname, age, email)
VALUES(1, 'Golda', 'Kaniecki', 25, 'golda_kaniecki@yahoo.com'),
(2, 'Catarina', 'Gleich', 43, 'catarina_gleich@hotmail.com');
  • Verify your table’s data with a SELECT statement.
SELECT * FROM sample_table;
  • You should see a result similar to this one, showing the records in sample_table:
 id | firstname | lastname | age |            email
----+-----------+----------+-----+-----------------------------
  1 | Golda     | Kaniecki |  25 | golda_kaniecki@yahoo.com
  2 | Catarina  | Gleich   |  43 | catarina_gleich@hotmail.com
(2 rows)

Create the PostgreSQL Export CSV Node application

  • Go to the project directory. If you don’t have one, make one with the mkdir command.

  • Then, create a JavaScript file for your PostgreSQL export CSV Node project. Note that it must have the .js extension.

  • Use npm init to initialize the application, and complete the prompts to finish the setup.


Edit the JavaScript file for the PostgreSQL Export CSV with Node.js app

  • If you use Visual Studio Code (VS Code), enter the code command below to create or edit the file:
code export-csv.js
  • Alternatively, you can use Sublime Text with its respective subl command:

subl export-csv.js

Require the Fast CSV Node library to work with CSV data

  • Your Node.js application needs the fast-csv and fs library modules. Require them in your JavaScript file like this:
const fastcsv = require("fast-csv");
const fs = require("fs");

Create a write stream for the CSV file using Node’s file system library

  • Create a write stream for the CSV file using Node’s built-in file system (fs) module. Use the require() function to load it:
const fs = require("fs");
const ws = fs.createWriteStream("sample_table.csv");

NOTE: By default, the CSV file is created in the application’s directory. If you want to export the CSV file you’re creating to a different folder, take care to define the absolute path of where that folder is located.
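For example, to write the file somewhere else, pass an absolute path to createWriteStream() (the path below is just a placeholder):

// placeholder absolute path; adjust it for your own system
const ws = fs.createWriteStream("/home/user/exports/sample_table.csv");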

Create a connection ‘pool’ to the PostgreSQL database

  • Add the pg library to your Node.js file. Use the require() function to access its Pool method with this code:
const Pool = require("pg").Pool;
  • Next, make a new connection pool to the PostgreSQL server with a Pool instance.

Connect to Postgres in a Node script using the Pool() method

  • To make a database connection, the Pool() constructor needs to receive a few server parameters. Use this code, adjusting the values for your own server:
const client = new Pool({
  host: "localhost",
  user: "postgres",
  database: "sample_db",
  password: "1234",
  port: 5432
});
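If you’d like to sanity-check those connection parameters before going further, an optional test query is one way to do it (this snippet is not part of the tutorial’s final script):

// optional: verify the pool can reach the PostgreSQL server
client.query("SELECT NOW()", (err, res) => {
  if (err) {
    console.error("Connection test failed:", err.stack);
  } else {
    console.log("Connected to PostgreSQL at:", res.rows[0].now);
  }
});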

Declare a string for the PostgreSQL table name

  • Declare a constant for the table name so it can be passed into the SQL statement afterward.
// declare constant for the table name
const tableName = "sample_table";

Connect to PostgreSQL and declare a SQL statement string

  • Here’s the code for making a PostgreSQL database server connection within the Node.js application. If the connection fails, the error is thrown:
// attempt to connect using the Pool
client.connect((err, client, done) => {

  // error handling for the client instance connection
  if (err) throw err;

  // SQL string that selects all records from a table
  const sqlQuery = `SELECT * FROM ${tableName}`;

Use the Fast-CSV Node library to pipe the Postgres records to the CSV write stream

  • The code below passes the SELECT SQL string to the pg library’s query() method. When the query finishes, the returned rows are written to the CSV file using the fastcsv.write() method.
  // pass SQL string and table name to query()
  client.query(sqlQuery, (err, res) => {

    if (err) {
      console.log("client.query()", err.stack)
    }

    if (res) {

      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("\njsonData:", jsonData)

      fastcsv
        // write the JSON data as a CSV file
        .write(jsonData, { headers: true })

        // log message when finished
        .on("finish", function() {
          console.log(`Postgres table ${tableName} exported to CSV file successfully.`);
        })

NOTE: It’s important that the options object { headers: true } is passed to the method call. This way, you’ll be sure the header row of the CSV file matches the column names of the PostgreSQL table.
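The same options object accepts other formatting settings as well. For instance, fast-csv’s delimiter option produces semicolon-separated output instead of commas (shown here as a variation, not part of the tutorial’s final script):

// a variation: semicolon-delimited output instead of the default commas
fastcsv
  .write(jsonData, { headers: true, delimiter: ";" })
  .pipe(ws);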

Pipe the Postgres data to the write stream

  • Type this code to call the pipe() method of the Fast-CSV library. You’ll see a message that the data has been piped when it’s finished.
        // pipe the CSV data through the writestream
        .pipe(ws)
    }

    // callback for finishing the CSV creation
    done(console.log('Creating CSV from client.query() data'))
  })
})

Connect to PostgreSQL and export CSV in Node by executing the script

  • Now, use the node command to start the application. After it iterates through the records, a CSV file is created automatically. It shouldn’t take long to finish; however, if your table has many records, it might take a few extra minutes.
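Assuming you kept the export-csv.js file name used earlier, the command looks like this:

node export-csv.js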

  • You should see a result similar to this one for your Node.js app:

jsonData: [
  {
    id: 1,
    firstname: 'Golda',
    lastname: 'Kaniecki',
    age: 25,
    email: 'golda_kaniecki@yahoo.com'
  },
  {
    id: 2,
    firstname: 'Catarina',
    lastname: 'Gleich',
    age: 43,
    email: 'catarina_gleich@hotmail.com'
  }
]
Creating CSV from client.query() data
Postgres table sample_table exported to CSV file successfully.

Conclusion

Confirm that your CSV file was created by opening it. Seeing your table’s data displayed there means your PostgreSQL export to CSV with Node was a success.
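Opening sample_table.csv, you should see contents similar to this (the exact output may vary slightly with your fast-csv version):

id,firstname,lastname,age,email
1,Golda,Kaniecki,25,golda_kaniecki@yahoo.com
2,Catarina,Gleich,43,catarina_gleich@hotmail.com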

This concludes the tutorial, where you learned how to export a PostgreSQL table to CSV with Node. CSV is one of many data formats that you as a developer deal with on a regular basis, and it’s good to know a simple process for exporting PostgreSQL table data into a CSV-compliant file to meet your application’s needs.

Just the Code

Here’s the entire sample code for the PostgreSQL export CSV Node application, for your reference.

const Pool = require("pg").Pool;
const fastcsv = require("fast-csv");

// create a write stream
const fs = require("fs");
const ws = fs.createWriteStream("sample_table.csv");

// declare a new client instance from Pool()
const client = new Pool({
  host: "localhost",
  user: "postgres",
  database: "sample_db",
  password: "1234",
  port: 5432
});

// declare constant for the table name
const tableName = "sample_table";

// attempt to connect using the Pool
client.connect((err, client, done) => {

  // error handling for the client instance connection
  if (err) throw err;

  // SQL string that selects all records from a table
  const sqlQuery = `SELECT * FROM ${tableName}`;

  // pass SQL string and table name to query()
  client.query(sqlQuery, (err, res) => {

    if (err) {
      console.log("client.query()", err.stack)
    }

    if (res) {

      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("\njsonData:", jsonData)

      fastcsv
        // write the JSON data as a CSV file
        .write(jsonData, { headers: true })

        // log message when finished
        .on("finish", function() {
          console.log(`Postgres table ${tableName} exported to CSV file successfully.`);
        })

        // pipe the CSV data through the writestream
        .pipe(ws)
    }

    // callback for finishing the CSV creation
    done(console.log('Creating CSV from client.query() data'))
  })
})
