Save PL/pgSQL output from PostgreSQL to a CSV file
Solution 1
Do you want the resulting file on the server, or on the client?
Server side
If you want something easy to re-use or automate, you can use PostgreSQL's built-in COPY command, e.g.
Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER;
This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "postgres") because Postgres can't stop it doing nasty things with that machine's local filesystem.
That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.
The crucial part is that your function is there to perform additional checks, not just bypass the security - so you could write a function which exports the exact data you need, or one which accepts various options as long as they meet a strict whitelist. You need to check two things:
- Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.
- Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don't want to let someone invoke your function and add rows on the end of your "users" table...
I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.
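As a rough sketch of that idea (the function name, the whitelist of tables, and the /var/exports directory are all hypothetical - a real function should validate its inputs at least this carefully):

```sql
-- Hypothetical example: export a whitelisted table to a fixed directory.
-- Created by a superuser; SECURITY DEFINER makes it run with the owner's rights.
CREATE FUNCTION export_table_csv(tablename text, filename text)
RETURNS void
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
    -- Check 1: only simple filenames with a .csv extension, no paths
    IF filename !~ '^[A-Za-z0-9_]+\.csv$' THEN
        RAISE EXCEPTION 'invalid filename: %', filename;
    END IF;
    -- Check 2: only explicitly whitelisted tables
    IF tablename NOT IN ('foo', 'bar') THEN
        RAISE EXCEPTION 'table % may not be exported', tablename;
    END IF;
    EXECUTE format(
        'COPY %I TO %L WITH CSV HEADER',
        tablename, '/var/exports/' || filename
    );
END;
$$;
```

You would then GRANT EXECUTE on this function to the role that needs it, so that role can export those two tables and nothing else.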
Client side
The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.
The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.
The psql command-line client has a special "meta-command" called \copy, which takes all the same options as the "real" COPY, but is run inside the client:
\copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER
Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.
From the docs:
Do not confuse COPY with the psql instruction \copy. \copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when \copy is used.
Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.
Solution 2
There are several solutions:
1. psql command
psql -d dbname -t -A -F"," -c "select * from users" > output.csv
This has the big advantage that you can use it via SSH, like ssh postgres@host command - enabling you to get the output onto your local machine.
2. postgres COPY command
COPY (SELECT * from users) To '/tmp/output.csv' With CSV;
3. psql interactive (or not)
>psql dbname
psql>\f ','
psql>\a
psql>\o '/tmp/output.csv'
psql>SELECT * from users;
psql>\q
All of them can be used in scripts, but I prefer #1.
4. pgAdmin - but that's not scriptable.
Solution 3
In the terminal (while connected to the db), set the output to the CSV file:
1) Set the field separator to ',':
\f ','
2) Set output format unaligned:
\a
3) Show only tuples:
\t
4) Set output:
\o '/tmp/yourOutputFile.csv'
5) Execute your query:
select * from YOUR_TABLE;
6) Reset the output back to the console:
\o
You will then be able to find your CSV file in this location:
cd /tmp
Copy it using the scp
command or edit using nano:
nano /tmp/yourOutputFile.csv
Solution 4
CSV Export Unification
This information isn't really well represented elsewhere. As this is the second time I've needed to derive it, I'll put it here to remind myself, if nothing else.
Really the best way to do this (get CSV out of postgres) is to use the COPY ... TO STDOUT command. Though you don't want to do it the way shown in the answers here. The correct way to use the command is:
COPY (select id, name from groups) TO STDOUT WITH CSV HEADER
Remember just one command!
It's great for use over ssh:
$ ssh psqlserver.example.com 'psql -d mydb "COPY (select id, name from groups) TO STDOUT WITH CSV HEADER"' > groups.csv
It's great for use inside docker over ssh:
$ ssh pgserver.example.com 'docker exec -tu postgres postgres psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv
It's even great on the local machine:
$ psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv
Or inside docker on the local machine?:
docker exec -tu postgres postgres psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv
Or on a kubernetes cluster, in docker, over HTTPS??:
kubectl exec -t postgres-2592991581-ws2td 'psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv
So versatile, much commas!
Do you even?
Yes I did, here are my notes:
The COPYses
Using \copy effectively executes file operations on whatever system the psql command is running on, as the user who is executing it. If you connect to a remote server, it's simple to copy data files on the system executing psql to/from the remote server.
COPY executes file operations on the server as the backend process user account (default postgres); file paths and permissions are checked and applied accordingly. If using TO STDOUT then file permission checks are bypassed.
Both of these options require subsequent file movement if psql is not executing on the system where you want the resultant CSV to ultimately reside. This is the most likely case, in my experience, when you mostly work with remote servers.
It is more complex to configure something like a TCP/IP tunnel over ssh to a remote system for simple CSV output, but for other output formats (binary) it may be better to \copy over a tunneled connection, executing a local psql. In a similar vein, for large imports, moving the source file to the server and using COPY is probably the highest-performance option.
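For instance, a large import along those lines might look like the following (the table and file names are illustrative):

```sql
-- Server-side import: the file path is read by the backend process,
-- so /tmp/users.csv must exist on the database server itself.
COPY users FROM '/tmp/users.csv' WITH CSV HEADER;

-- Client-side equivalent, reading the file from the machine running psql:
-- \copy users FROM 'users.csv' WITH CSV HEADER
```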
PSQL Parameters
With psql parameters you can format the output like CSV, but there are downsides, like having to remember to disable the pager and not getting headers:
$ psql -P pager=off -d mydb -t -A -F',' -c 'select * from groups;'
2,Technician,Test 2,,,t,,0,,
3,Truck,1,2017-10-02,,t,,0,,
4,Truck,2,2017-10-02,,t,,0,,
Other Tools
No, I just want to get CSV out of my server without compiling and/or installing a tool.
Solution 5
Newer versions - psql 12 and up - support --csv.
--csv
Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.
csv_fieldsep
Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.
Usage:
psql -c "SELECT * FROM pg_catalog.pg_tables" --csv postgres
psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^' postgres
psql -c "SELECT * FROM pg_catalog.pg_tables" --csv postgres > output.csv
Hoff
Updated on January 11, 2022
Comments
-
Hoff, over 2 years ago: What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file? I'm using PostgreSQL 8.4 with pgAdmin III and the PSQL plugin where I run queries from.
-
Peter Krauss, about 9 years ago: See also stackoverflow.com/q/1120109/287948
-
Daechir, over 14 years ago: This isn't a true CSV file - watch it burn if there are commas in the data - so using the built-in COPY support is preferred. But this general technique is handy as a quick hack for exporting from Postgres in other delimited formats besides CSV.
-
Mike Neumegen, over 12 years ago: IMSoP's answer didn't work for me as I needed to be a super admin. This worked a treat. Thanks!
-
Drachenfels, about 12 years ago: Obviously the above example sometimes requires the user to be a superuser; here's a version for ordinary people ;) echo "COPY (SELECT * from foo) TO STDOUT with CSV HEADER" | psql -o '/tmp/test.csv' database_name
-
metdos, almost 12 years ago: and \o in order to print to the console again
-
Ruslan Kabalin, over 11 years ago: This will not produce a CSV file, it will just record the command output to the text file (which does not make it comma-separated).
-
Marcin Wasiluk, over 11 years ago: @RuslanKabalin yes, I have just noticed that and amended the instructions to create comma-separated output (CSV)
-
krlmlr, about 11 years ago: @Drachenfels: \copy works, too - there, the paths are relative to the client, and no semicolon is needed/allowed. See my edit.
-
Piohen, about 11 years ago: IMHO the first option is error prone, because it doesn't include proper escaping of commas in exported data.
-
sorin, about 11 years ago: @Piohen as far as I remember it does, because it will quote strings, but I'm not 100% sure; better to test.
-
jO., over 10 years ago: @IMSoP: How would you add a COPY statement to an SQL function (on Postgres 9.3)? So the query gets saved to a .csv file?
-
Devy, over 10 years ago: If the query is custom (i.e. having column aliases or joining different tables), the header will print out the column aliases just as they display on the screen.
-
isaaclw, over 10 years ago: It looks like \copy needs to be a one-liner. So you don't get the beauty of formatting the SQL the way you want, and just putting a copy/function around it.
-
Danny Armstrong, over 10 years ago: I'd improve this answer by noting that the "csv" output will not be properly escaped and each time a SQL command is executed the results are concatenated to the output file.
-
user411279, about 10 years ago: It seems more the exception than the rule that you can run your command on the server. For this reason, I think sorin's answer is much better.
-
IMSoP, about 10 years ago: @user411279 The big advantage of a properly secured use of COPY FROM on the server is that you can build it into a script or web application in your language of choice without having to rely on a separate shell command, which will need a DB password etc. to connect.
-
IMSoP, about 10 years ago: @user411279 Also, the \copy command in the second half of my answer is equivalent to the > output.csv in the first part of sorin's, and the COPY command in the second part of sorin's answer is the one I discuss in detail in the first half of mine.
-
Cerin, about 10 years ago: Unfortunately, the first method doesn't include column headers, which makes it fairly useless for most applications...
-
Cerin, about 10 years ago: Also, psql doesn't quote cell values, so if ANY of your data uses the delimiter, your file will be corrupted.
-
ic3b3rg, almost 10 years ago: @Cerin -t is a synonym for --tuples-only (turn off printing of column names and result row count footers, etc.) - omit it to get column headers
-
MrColes, over 9 years ago: Just tested the comma-escaping claim - it's true, method #1 does not escape commas in values.
-
MrValdez, almost 9 years ago: I fixed #3 so it'll output as a CSV file. It still won't escape commas though.
-
Samuel Dauzon, over 8 years ago: Thanks for solution 3. It doesn't need a superuser account.
-
Bogdan, over 8 years ago: Also, note that if you want to automate SSH interactions with the psql utility, there is no way to specify the password from the command prompt. The best way to deal with it is to create a password file as described here: postgresql.org/docs/9.4/static/libpq-pgpass.html
-
ady, over 8 years ago: The COPY command is the easiest and most efficient method to extract data from PostgreSQL in CSV format (it also provides a binary format which is slightly more efficient than the CSV one).
-
verisimilitude, almost 8 years ago: I keep on forgetting the syntax of this command and keep on visiting your answer!
-
Massimo, over 7 years ago: I have a complex SELECT in an external file. Any suggestion for the COPY syntax?
-
Wildcard, over 7 years ago: What about newlines in field values? The COPY or \copy approaches handle them correctly (converting to standard CSV format); does this?
-
CodingInCircles, over 7 years ago: For method 1, I saved it as a TSV file and gave "}" as the field separator. Of course, if your data has }, this is a terrible idea. But it worked for me. I opened it in Excel, split the columns and it was fine. Not really advisable as a long-term solution, but for one-offs it works fine.
-
Dion Truter, over 7 years ago: For complex queries it is useful to just run a script file: psql -d dbname -t -A -F"," -f my-report.sql > my-report.csv
-
GGO, about 6 years ago: Please explain what you did editing the answer; avoid code-only answers.
-
Toby Speight, about 6 years ago: Thank you for this code snippet, which might provide some limited short-term help. A proper explanation would greatly improve its long-term value by showing why this is a good solution to the problem, and would make it more useful to future readers with other, similar questions. Please edit your answer to add some explanation, including the assumptions you've made.
-
nvoigt, about 6 years ago: This will produce a JSON file, not a CSV file.
-
kRazzy R, about 6 years ago: Where do the results get saved to? My query runs but the file doesn't show up anywhere on my computer. This is what I'm doing: COPY (select a,b from c where d = '1') TO STDOUT WITH CSVHEADER > abcd.csv
-
joshperry, about 6 years ago: @kRazzyR The output goes to stdout of the psql command, so ultimately whatever you do with stdout is where the data goes. In my examples I use '> file.csv' to redirect to a file. You want to make sure that is outside the command being sent to the server through the psql -c parameter. See the 'local machine' example.
-
Andre Silva, about 6 years ago: I was not able to run \copy from pgAdmin. Do you know how to do it (it seems it was not recognizing the command)? I succeeded using it from the SQL Shell (psql) though.
-
IMSoP, about 6 years ago: @AndreSilva As the answer states, \copy is a special meta-command in the psql command-line client. It won't work in other clients, like pgAdmin; they will probably have their own tools, such as graphical wizards, for doing this job.
-
techbrownbags, about 6 years ago: Also use "\pset footer" so the row counts don't show up in the file.
-
Abel Callejo, almost 6 years ago: How can this include the column names?
-
CFL_Jeff, almost 6 years ago: FWIW, I had to use -F"," instead of -F";" to generate a CSV file that would open correctly in MS Excel.
-
EliadL, over 4 years ago: @MarcinWasiluk please see this comment. You can replace \t with \pset footer to achieve the same goal, with the added value of retaining the header. I tried to edit your answer myself but it was rejected.
-
EliadL, over 4 years ago: @sorin please see this comment. You can replace -t with -P footer (and add \pset footer interactively) to achieve the same goal, with the added value of retaining the header. I tried to edit your answer myself but it was rejected.
-
Rich Lysakowski PhD, over 4 years ago: Thanks for the complete explanation. The copy command is hopelessly complex with psql. I usually end up using a free database client (DBeaver Community Edition) to import and export data files. It provides nice mapping and formatting tools. Your answer provides great detailed examples for copying from remote systems.
-
Lightheaded, almost 4 years ago: Great, thanks! I've used psql -h dblocation -p port -U user -d dbname -F $',' --no-align -c "SELECT * FROM TABLE" > outfile.csv to get CSVs. There's no quoting of the fields, but it serves well enough for my purposes.
-
Simon Puente, almost 4 years ago: How can I use \copy or COPY with \i path/to/file.sql?
-
harryghgim, over 3 years ago: This is an amazing solution. Thanks a lot.
-
Somto, over 3 years ago: I had the permission denied error as well. Fixed it by sending to the /tmp folder first. For example: \copy (SELECT * FROM messages) TO '/tmp/messages.csv' With CSV HEADER;
-
tospo, almost 3 years ago: I think you should remove the first solution. It is far too error prone. It looks like it works well until you realise that it will never quote (unlike solution 2), and hence you can easily create a large file that looks fine at first but is full of messed-up lines due to a comma in a cell value.
-
combinatorist, almost 3 years ago: FYI, you can configure .pg_service.conf to alias the connection params, like psql service=default -F $'\t' ....
-
Dek4nice, over 2 years ago: For approach 1 (the psql command): instead of the query select * from users, use the quote function quote_literal() on each varchar column: select id, quote_literal(name), quote_literal(email), ... from users
-
Himanshu, over 2 years ago: Redshift supports UNLOAD.