performance - Very slow "update table" operation on Postgres


I'm using Postgres 9.2, and for some reason I'm seeing very slow updates on a relatively small table (16k rows). Table DDL:

create table my_categories (
  id serial,
  category_id integer,
  is_done smallint default 0,
  expected_number_of_flags integer default 0 not null,
  number_of_flags integer default 0 not null,
  is_active smallint default 0 not null,
  constraint my_categories_pkey primary key (id)
) with (oids = false);

create index my_categories_idx on my_categories using btree (category_id);

And here is the explain analyze output for an update of a single category:

explain analyze
update my_categories
set expected_number_of_flags = expected_number_of_flags + 1
where category_id = 96465;

update on my_categories  (cost=4.27..8.29 rows=1 width=26) (actual time=199746.281..199746.281 rows=0 loops=1)
  ->  bitmap heap scan on my_categories  (cost=4.27..8.29 rows=1 width=26) (actual time=50.937..51.193 rows=1 loops=1)
        recheck cond: (category_id = 96465)
        ->  bitmap index scan on my_categories_idx  (cost=0.00..4.27 rows=1 width=0) (actual time=1.600..1.600 rows=6167 loops=1)
              index cond: (category_id = 96465)
total runtime: 199746.339 ms

Can anyone explain what's going on here, and how I can improve the performance of this update?

thanks!

You can try setting the table's fillfactor to 70 or 60 (the default is 100). A lower fillfactor leaves free space on each page, so updated row versions can often be placed on the same page (HOT updates) instead of bloating the table and its index:

alter table my_categories set (fillfactor = 70);
vacuum full my_categories;
reindex table my_categories;

Then try running the update query again and compare the timings.
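If it helps, here is a minimal sketch (assuming the table name above) for confirming that the new fillfactor is in effect and for watching whether subsequent updates are HOT, using pg_class.reloptions and the pg_stat_user_tables counters:

-- check that the fillfactor option is now stored on the table
-- (reloptions should show {fillfactor=70})
select relname, reloptions
from pg_class
where relname = 'my_categories';

-- compare total vs. HOT updates; if n_tup_hot_upd grows along with
-- n_tup_upd after the change, the lower fillfactor is doing its job
select relname, n_tup_upd, n_tup_hot_upd, n_dead_tup
from pg_stat_user_tables
where relname = 'my_categories';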

