PostgreSQL Export CSV with Node
Introduction
Node.js is known for building scalable applications, and pairing it with a PostgreSQL database gives you a powerful combination for your development needs. Sooner or later, though, you'll need to export data in different file formats, and CSV is one of the most common. This tutorial explains how to export a PostgreSQL table to CSV with Node.js.
If you’re already familiar with the steps of exporting CSV with Node and want to bypass these steps, go to Just the Code.
Prerequisites to using PostgreSQL and Node.js
Verify that you've installed PostgreSQL.
- On Linux (including Ubuntu and its derivatives), confirm an `active` status by entering the following at the command line, then press CTRL+C to exit:

```shell
service postgresql status
```

- Check that the `psql` interactive command-line client is installed:

```shell
psql -V
```

- On macOS (this also works on Linux), the following command displays all active PostgreSQL processes:

```shell
ps aux | grep postgres
```

- If you don't have Node.js installed, do so now.
Check that Node and its ‘npm’ package manager are installed
At the command line, type the following to identify the version of Node.js on your OS:

```shell
node -v
```

Next, type the following to confirm that the Node Package Manager is installed and running:

```shell
npm -v
```
Use the Node Package Manager to install the ‘pg’ Postgres packages
- Add the `pg` client to your Node.js dependencies:

```shell
npm i pg
```
Next, you'll need a CSV module to export the PostgreSQL table data via Node.js. Popular library options include `json2csv`, `fast-csv`, and `csv-writer`; this tutorial uses `fast-csv`.
Install the Fast CSV Node library for accessing CSV data in Node.js
Parsing data is the role of the `fast-csv` module. It does this by delimiting data values so the format becomes CSV-compliant.
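To illustrate what "CSV-compliant" delimiting means, here is a minimal hand-rolled sketch. This is not fast-csv's internal implementation, just a demonstration of the quoting rules (per RFC 4180) that a CSV writer must apply:

```javascript
// Minimal sketch of CSV escaping -- not fast-csv's internals,
// just an illustration of what "CSV-compliant" delimiting involves.
function toCsvValue(value) {
  const str = String(value);
  // Values containing delimiters, quotes, or newlines must be quoted,
  // with embedded quotes doubled per RFC 4180.
  if (/[",\n]/.test(str)) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}

function toCsvLine(values) {
  return values.map(toCsvValue).join(",");
}

console.log(toCsvLine(["Golda", "Kaniecki", 'say "hi"', "a,b"]));
// → Golda,Kaniecki,"say ""hi""","a,b"
```

Libraries like fast-csv handle these edge cases (and many more, such as custom delimiters) for you, which is why the tutorial uses one instead of string concatenation.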
- At a terminal window, install `fast-csv` with npm:

```shell
npm i fast-csv
```
- After a successful installation, you should see a response like this one:
```
+ fast-csv@3.4.0
added 4 packages from 50 contributors and audited 245 packages in 5.098s
found 0 vulnerabilities
```
Insert some records into a PostgreSQL table
Make a sample database with a table of records to test out the examples given in this tutorial.
- To add records to a PostgreSQL `sample_table`, use the `psql` command:

NOTE: If you've already created a sample database and a table with records in it, feel free to skip this step.

```sql
INSERT INTO sample_table(id, firstname, lastname, age, email)
VALUES (1, 'Golda', 'Kaniecki', 25, 'golda_kaniecki@yahoo.com'),
(2, 'Catarina', 'Gleich', 43, 'catarina_gleich@hotmail.com');
```
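If `sample_table` doesn't exist yet, you'll need to create it first. The following schema is a minimal sketch inferred from the columns used in the INSERT statement above; your actual column types may differ:

```sql
CREATE TABLE sample_table (
  id INTEGER PRIMARY KEY,
  firstname TEXT,
  lastname TEXT,
  age INTEGER,
  email TEXT
);
```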
- Test your table's data by using a `SELECT` statement:

```sql
SELECT * FROM sample_table;
```
- You should see a result similar to this one, showing the records in `sample_table`:

```
 id | firstname | lastname | age |            email
----+-----------+----------+-----+-----------------------------
  1 | Golda     | Kaniecki |  25 | golda_kaniecki@yahoo.com
  2 | Catarina  | Gleich   |  43 | catarina_gleich@hotmail.com
(2 rows)
```
Create the PostgreSQL Export CSV Node application
Navigate to your project directory (or use the `mkdir` command to create a new one), and then create a new JavaScript file (with the `.js` file extension) for your CSV Node.js application. Make sure to run `npm init` and complete the subsequent text prompts, if you haven't already, to initialize your application.
Edit the JavaScript file for the PostgreSQL Export CSV with Node.js app
- If you use Visual Studio Code (VS Code), input the command below to create a new file or edit an existing one:

```shell
code export-csv.js
```

- You can also use Sublime Text with its respective `subl` command:

```shell
subl export-csv.js
```
Require the Fast CSV Node library to work with CSV data
- Your Node.js application needs these library modules. Require them in JavaScript like this:

```javascript
const fastcsv = require("fast-csv");
const fs = require("fs");
```
Create a write stream for the CSV file using Node’s file system library
- Construct a CSV write stream using Node's built-in `fs` (file system) module:

```javascript
const fs = require("fs");
const ws = fs.createWriteStream("sample_table.csv");
```
NOTE: The CSV file is written to the application's directory by default. If you want to export the CSV file you're creating to a different folder, take care to define the absolute path of where that folder is located.
Create a connection ‘pool’ to the PostgreSQL database
- Add the `pg` library to your Node.js file using the `require()` function, and grab its `Pool` class with this code:

```javascript
const Pool = require("pg").Pool;
```
- Next, use that class to make a new connection pool to the PostgreSQL server in Node.
Connect to Postgres in a Node script using the Pool() method
- To make a database connection, the `Pool()` constructor needs to receive a few server parameters. Use this code to accomplish that:

```javascript
const client = new Pool({
  host: "localhost",
  user: "postgres",
  database: "sample_db",
  password: "1234",
  port: 5432
});
```
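Hard-coding credentials is fine for a local test, but a common alternative is reading them from environment variables with local fallbacks. A sketch, assuming the `PG*` variable names that mirror libpq conventions (they are not required by `pg` when you pass an explicit config object):

```javascript
// Read connection settings from the environment, falling back to
// the local defaults used in this tutorial.
const config = {
  host: process.env.PGHOST || "localhost",
  user: process.env.PGUSER || "postgres",
  database: process.env.PGDATABASE || "sample_db",
  password: process.env.PGPASSWORD || "1234",
  port: Number(process.env.PGPORT) || 5432
};

// const client = new Pool(config);
console.log(config.host, config.port);
```

This keeps the password out of source control and lets the same script run against different databases.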
Declare a string for the PostgreSQL table name
- Declare a constant for the table name so it can be passed into the SQL statement later:

```javascript
// declare constant for the table name
const tableName = "sample_table";
```
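A note on this constant: the table name gets interpolated directly into the SQL string with a template literal, because `pg` can only bind values (like `$1` placeholders), not identifiers such as table or column names. That interpolation is only safe for trusted, hard-coded values like this constant, never for user input. A standalone sketch:

```javascript
const tableName = "sample_table";

// Identifiers cannot be bound as $1 parameters in pg, so the table
// name is interpolated directly -- safe only because tableName is a
// trusted, hard-coded constant.
const sqlQuery = `SELECT * FROM ${tableName}`;

console.log(sqlQuery); // SELECT * FROM sample_table
```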
Connect to PostgreSQL and declare a SQL statement string
- Here's the code for making a connection to the PostgreSQL database server within the Node.js application. The callback surfaces any connection errors that come up:

```javascript
// attempt to connect using the Pool
client.connect((err, client, done) => {
  // error handling for the client instance connection
  if (err) throw err;

  // SQL string that selects all records from a table
  const sqlQuery = `SELECT * FROM ${tableName}`;
```
Use the Fast-CSV Node library to pipe the Postgres records to the CSV write stream
- The code below passes the `SELECT` SQL string to the `query()` method of the `pg` library. Once the query finishes, the data is written to the CSV file using the `fastcsv.write()` method:

```javascript
  // pass SQL string and table name to query()
  client.query(sqlQuery, (err, res) => {
    if (err) {
      console.log("client.query()", err.stack);
    }
    if (res) {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("\njsonData:", jsonData);
      fastcsv
        // write the JSON data as a CSV file
        .write(jsonData, { headers: true })
        // log message when finished
        .on("finish", function() {
          console.log(`Postgres table ${tableName} exported to CSV file successfully.`);
        })
```
NOTE: It's important that the object `{ headers: true }` is passed to the method call. This ensures the CSV file's header row matches the column names of the PostgreSQL table.
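With `{ headers: true }` and rows that are plain objects, fast-csv derives the header row from the keys of the row objects, which in turn come from the Postgres column names in `res.rows`. A quick sketch of that mapping in plain JavaScript, using sample data shaped like this tutorial's query result:

```javascript
// Rows as returned in res.rows: one object per record,
// keyed by column name.
const rows = [
  { id: 1, firstname: "Golda", lastname: "Kaniecki" },
  { id: 2, firstname: "Catarina", lastname: "Gleich" }
];

// The CSV header line mirrors the keys of the row objects.
const headers = Object.keys(rows[0]);
console.log(headers.join(",")); // id,firstname,lastname
```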
Pipe the Postgres data to the write stream
- Type this code to call the `pipe()` method of the Fast-CSV library, sending the CSV output through the write stream. A message logs when the data has been piped:

```javascript
        // pipe the CSV data through the write stream
        .pipe(ws);
    }
    // callback for finishing the CSV creation
    done(console.log("Creating CSV from client.query() data"));
  });
});
```
Connect to PostgreSQL and export CSV in Node by executing the script
Now, use the `node` command to run the application (for example, `node export-csv.js`). After it iterates over the records, the CSV file is created automatically. It shouldn't take long to finish; however, if your table has many records, it might take a few extra minutes. You should see a result similar to this one for your Node.js app:
```
jsonData: [
  { id: 1,
    firstname: 'Golda',
    lastname: 'Kaniecki',
    age: 25,
    email: 'golda_kaniecki@yahoo.com' },
  { id: 2,
    firstname: 'Catarina',
    lastname: 'Gleich',
    age: 43,
    email: 'catarina_gleich@hotmail.com' } ]
Creating CSV from client.query() data
Postgres table sample_table exported to CSV file successfully.
```
Conclusion
Confirm that your CSV file was created by opening it. If you see your table's data inside, the export was a success.
This concludes the tutorial on exporting a PostgreSQL table to CSV with Node.js. CSV is one of many formats that you as a developer deal with on a regular basis, and it's good to know a simple process for exporting PostgreSQL table data into a CSV-compliant file to meet your application needs.
Just the code
Here's the entire sample code for reference.

```javascript
const Pool = require("pg").Pool;
const fastcsv = require("fast-csv");

// create a write stream
const fs = require("fs");
const ws = fs.createWriteStream("sample_table.csv");

// declare a new client instance from Pool()
const client = new Pool({
  host: "localhost",
  user: "postgres",
  database: "sample_db",
  password: "1234",
  port: 5432
});

// declare constant for the table name
const tableName = "sample_table";

// attempt to connect using the Pool
client.connect((err, client, done) => {
  // error handling for the client instance connection
  if (err) throw err;

  // SQL string that selects all records from a table
  const sqlQuery = `SELECT * FROM ${tableName}`;

  // pass SQL string and table name to query()
  client.query(sqlQuery, (err, res) => {
    if (err) {
      console.log("client.query()", err.stack);
    }
    if (res) {
      const jsonData = JSON.parse(JSON.stringify(res.rows));
      console.log("\njsonData:", jsonData);
      fastcsv
        // write the JSON data as a CSV file
        .write(jsonData, { headers: true })
        // log message when finished
        .on("finish", function() {
          console.log(`Postgres table ${tableName} exported to CSV file successfully.`);
        })
        // pipe the CSV data through the write stream
        .pipe(ws);
    }
    // callback for finishing the CSV creation
    done(console.log("Creating CSV from client.query() data"));
  });
});
```