Dump to CSV/Postgres memory

Posted by alex on Stack Overflow
Published on 2010-05-04T20:22:29Z


I have a large PostgreSQL table (300 million rows) that I would like to dump to a CSV file, because I need to do some processing on it that cannot be done in SQL. Right now I am using SQuirreL SQL as a client, and as far as I can tell from my own (limited) experience, it does not handle large result sets well. If I run the query on the database host itself, will it use less memory? Thanks for any help.
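For context, one low-memory approach (a minimal sketch, not from the original post; database and table names are placeholders) is to let PostgreSQL stream the rows with COPY and have psql write them to disk as they arrive, so the full result set is never held in client memory:

    # From any machine with psql: \copy runs COPY over the wire and writes
    # each row to the local file as it is received, instead of buffering
    # the whole table in the client.
    psql -h dbhost -d mydb -c "\copy my_big_table TO 'my_big_table.csv' WITH CSV HEADER"

    # If you have superuser access on the database host, COPY can write
    # the file directly on the server's filesystem instead:
    psql -d mydb -c "COPY my_big_table TO '/tmp/my_big_table.csv' WITH CSV HEADER"

Either way the memory pressure stays on the server side and is per-row, which avoids the client-side buffering that a GUI client like SQuirreL typically does.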

