Analysing/Profiling queries on PostgreSQL

Solution 1

Use the pg_stat_statements extension to find long-running queries. Then use select * from pg_stat_statements order by total_time / calls desc limit 10 to get the ten queries with the highest average time per call. Then use explain to see each query's plan...
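A minimal sketch of that query; note that on PostgreSQL 13 and later the column is named total_exec_time rather than total_time:

    -- Ten statements with the highest average execution time per call.
    -- On PostgreSQL 13+, substitute total_exec_time for total_time.
    SELECT query,
           calls,
           total_time,                    -- cumulative time, in milliseconds
           total_time / calls AS avg_ms   -- average time per call
    FROM pg_stat_statements
    ORDER BY total_time / calls DESC
    LIMIT 10;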

Solution 2

My general approach is a mixture of techniques, and it requires no extensions.

  1. Set log_min_duration_statement to catch long-running queries (see the sketch after this list). https://dba.stackexchange.com/questions/62842/log-min-duration-statement-setting-is-ignored should get you started.

  2. Profile client applications to see which queries they spend their time on. Sometimes a query takes very little time per execution but is repeated so frequently that it causes performance problems.
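As a sketch of step 1, log_min_duration_statement can be set in postgresql.conf, cluster-wide at runtime, or per session; the 250 ms threshold below is only an illustrative value:

    -- Log every statement that runs longer than 250 ms (illustrative threshold).
    -- Cluster-wide, at runtime (requires superuser):
    ALTER SYSTEM SET log_min_duration_statement = '250ms';
    SELECT pg_reload_conf();

    -- Or just for the current session:
    SET log_min_duration_statement = '250ms';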

Of course, explain analyze can then help. If you are looking inside plpgsql functions, however, you often need to pull out the queries and run explain analyze on them directly.

Note: ALWAYS run explain analyze in a transaction that rolls back, or in a read-only transaction, unless you know that the statement does not write to the database.
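For example, since explain analyze actually executes the statement, a write can be profiled safely by rolling the transaction back afterwards (the accounts table in the UPDATE below is a hypothetical example):

    BEGIN;
    -- EXPLAIN ANALYZE really executes the statement, so wrap writes in a
    -- transaction and roll back to leave the data untouched.
    EXPLAIN ANALYZE
    UPDATE accounts SET balance = balance * 1.01 WHERE region = 'EU';
    ROLLBACK;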

Comments

  • dwildiac almost 2 years

    I've just inherited an old PostgreSQL installation and need to do some diagnostics to find out why this database is running slowly. On MS SQL Server you would use a tool such as Profiler to see what queries are running and then see what their execution plans look like.

    What tools, if any, exist for PostgreSQL that let me do this? I would appreciate any help, since I'm quite new to Postgres.

  • JonnyRaa about 3 years
    After adding pg_stat_statements to your config and restarting, you'll also need to run the command create extension pg_stat_statements (sketched below).
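A sketch of that setup, assuming you can edit postgresql.conf and have superuser rights in the target database:

    -- In postgresql.conf (a server restart is required for this to take effect):
    --   shared_preload_libraries = 'pg_stat_statements'

    -- Then, connected to the database you want to monitor:
    CREATE EXTENSION pg_stat_statements;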