Efficiently trimming PostgreSQL tables
Posted by agilefall on Stack Overflow
Published on 2010-04-19T23:16:23Z
I have about 10 tables with over 2 million records each, and one with 30 million. I would like to efficiently remove older data from each of these tables.
My general algorithm is:
- create a temp table for each large table and populate it with newer data
- truncate the original tables
- copy the tmp data back to the original tables using: "insert into originaltable (select * from tmp_table)" (roughly as sketched below)
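In concrete terms, the per-table workflow is roughly the sketch below; the table name, cutoff column, and retention interval are just placeholders for my actual schema:

```sql
-- Rough sketch of the current per-table workflow.
-- "events" and "created_at" are placeholder names, not my real schema.
BEGIN;

-- 1. Stage the rows I want to keep in a temp table.
CREATE TEMP TABLE tmp_events AS
SELECT *
FROM events
WHERE created_at >= now() - interval '1 year';

-- 2. Empty the original table quickly.
TRUNCATE events;

-- 3. Copy the newer rows back (this is the slow step).
INSERT INTO events
SELECT * FROM tmp_events;

COMMIT;
```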
However, the last step of copying the data back is taking longer than I'd like. I thought about dropping the original tables and making the temp tables "permanent" instead, but then I lose the constraint/foreign key info.
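The swap I was considering would look roughly like this (again with placeholder names, and assuming the staging table is an ordinary table rather than a true TEMPORARY one, since a TEMP table can't simply be renamed into a permanent table):

```sql
-- The swap I decided against, with placeholder names.
BEGIN;
DROP TABLE events CASCADE;                -- CASCADE also drops FKs in other tables that point here
ALTER TABLE events_new RENAME TO events;  -- staging table becomes the live table
COMMIT;
-- Indexes, constraints, and foreign keys all have to be recreated by hand afterwards.
```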
If I delete from the tables directly, it takes much longer. Given that I need to preserve all foreign keys and constraints, are there any faster ways of removing the older data?
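(For reference, the "delete directly" approach I'm comparing against is just a plain DELETE with a date cutoff, along these lines; the column name and interval are again only placeholders.)

```sql
-- Plain delete of the older rows; much slower in practice on tables this size.
DELETE FROM events
WHERE created_at < now() - interval '1 year';
```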
Thanks.