Query Slow

Level 3 Adventurer

Query Slow

Hi,

 

A PostgreSQL 9.6 table has 75K records. If we run select * from the table, it takes more than 55 minutes to complete.

The table has only two columns, and the column data type is JSON. Please advise on this scenario.

Please suggest how to tune the query.

 

Note:

There is no option to use a WHERE condition or an index.

Statistics are up to date and the optimizer is working properly.

Is there any other way to tune this?

 

Level 3 Adventurer

Re: Query Slow

Hi,

 

We have tested the JSON data types with PostgreSQL 9.6, and it gave proper output for 75K rows with the default configuration.

 

postgres=# select * from orders;
id | info | info2
-------+------------------------------------------------------------------+------------------------------------------------------------------
1 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
2 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
3 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
4 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
5 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
6 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
7 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
8 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
9 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
10 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
11 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
12 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
13 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}
14 | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}} | { "customer": "John Doe", "items": {"product": "Beer","qty": 6}}


We suspect that in your case the data in the JSON columns is larger than in our test case.

If a WHERE clause and indexes are not possible, then you can try LIMIT and OFFSET in your query, if possible.
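As a sketch of what that paging could look like (using the orders table from our test above — substitute your own table, and note that a stable ORDER BY column is assumed so the pages are deterministic):

```sql
-- Fetch the table in pages of 1000 rows instead of all at once.
SELECT *
FROM orders
ORDER BY id           -- stable ordering keeps OFFSET paging deterministic
LIMIT 1000 OFFSET 0;  -- next page: OFFSET 1000, then 2000, and so on
```

Each page transfers and renders a bounded amount of data, which matters most when the client (rather than the server) is the bottleneck.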

If that is not possible, then in order to tune the PostgreSQL parameters, can you share the following info, so that we can advise which parameters to tune:

1. Hardware details such as OS, RAM, and processors.
2. PostgreSQL parameter values such as max_connections, shared_buffers, temp_buffers, work_mem, maintenance_work_mem, effective_cache_size, and max_worker_processes.
3. EXPLAIN ANALYZE output for the query.
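The parameter values above can be collected from psql in one query against the pg_settings view (or individually with SHOW):

```sql
-- One at a time:
SHOW shared_buffers;

-- Or all the requested parameters at once:
SELECT name, setting, unit
FROM pg_settings
WHERE name IN ('max_connections', 'shared_buffers', 'temp_buffers',
               'work_mem', 'maintenance_work_mem',
               'effective_cache_size', 'max_worker_processes');
```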

 

NOTE: If you are using pgAdmin to fetch the data, it will take more time to load and render the result set, so try LIMIT and OFFSET in your query.

Level 3 Adventurer

Re: Query Slow

Hi @ajinkya,

Thanks for your time. I have shared the details from my side; please look into them and let me know. Thanks.

 

*Hardware Details*

CPU: 8 cores (Intel(R) Xeon(R) CPU E5-2670 v2 @ 2.50GHz)

RAM: 30GB

OS: CentOS Linux release 7.2.1511 (Core)

*Parameters*

max_connections: 300

shared_buffers: 8GB

temp_buffers: 8MB

work_mem: 4MB

maintenance_work_mem: 64MB

 

*Explain Plan*

g***t=# explain (analyze,buffers) select * from hgappsst.webhook_response where app_id ='2822';
                                                        QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------
 Seq Scan on webhook_response  (cost=0.00..15812.00 rows=76463 width=1367) (actual time=0.014..98.590 rows=75598 loops=1)
   Filter: (app_id = '2822'::bigint)
   Rows Removed by Filter: 2428
   Buffers: shared hit=14824 dirtied=162
 Planning time: 0.647 ms
 Execution time: 136.568 ms
(6 rows)

 

NOTE: The table already has an index on that column, but the index is not used because the filter matches most of the distinct values.

We have already forced the index using enable_seqscan=off, but it took more time and had a higher cost.
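For what it's worth, the plan above shows the filter keeps 75598 of 78026 rows (roughly 97%), which is exactly the situation where a sequential scan is cheaper than an index scan. A quick way to confirm that selectivity (FILTER is available since PostgreSQL 9.4):

```sql
-- How selective is the predicate, really?
SELECT count(*) FILTER (WHERE app_id = '2822') AS matching_rows,
       count(*)                                 AS total_rows
FROM hgappsst.webhook_response;
```

When matching_rows is close to total_rows, forcing the index will only add random I/O on top of reading nearly the whole table.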

EDB Team Member

Re: Query Slow

In my view, the parameters are set appropriately for an initial configuration. Commenting further without having a look at the server would be guesswork.

 

Please go through the following link, which explains how to investigate further.

Examining Index Usage

 

Updating statistics, which can be done with the ANALYZE command, will make sure the planner chooses the right plans.

Also, make sure your table is not bloated: run VACUUM on a regular basis, which helps clear bloat and improves performance.
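As a concrete starting point (table name taken from the plan shared above), refreshing statistics and checking for bloat might look like this:

```sql
-- Refresh planner statistics for the table:
ANALYZE hgappsst.webhook_response;

-- Reclaim space from dead tuples; VERBOSE reports what was cleaned:
VACUUM (VERBOSE) hgappsst.webhook_response;

-- A rough bloat indicator: a high dead-to-live tuple ratio suggests
-- vacuum is not keeping up.
SELECT relname, n_live_tup, n_dead_tup, last_vacuum, last_autovacuum
FROM pg_stat_user_tables
WHERE relname = 'webhook_response';
```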