Loading a big database dump into PostgreSQL using cat
Posted by RussH on Server Fault, published on 2014-06-02T19:28:03Z.
I have a pair of very large (~17 GB) database dumps that I want to load into PostgreSQL 9.3. After installing the database packages, learning more or less how to use them, and fiddling around a little on various StackExchange pages (particularly this question), it looks like a proper command for me to use is something like:
cat mydb.pgdump | psql mydb
because of the format the dump is in. My machine has 16 GB of RAM, and I'm not familiar with the cat command, but I do know that my RAM is 99% exhausted and the database is taking a while to load. My machine isn't unresponsive to the point of hanging; I can run other commands in other terminal windows and have them execute at a reasonable clip. But I am wondering whether cat is the best way to pipe in the file, or if something else is more efficient. My concern is that cat might be using up all the RAM, leaving the database little to work with and throttling its performance. But I'm new to thinking about RAM issues like this and don't know if I'm worrying about nothing.
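As a quick sanity check on whether cat itself can hoard memory, here is a small experiment (my own, not from any of the linked pages): pipe a large amount of data through cat and count the bytes on the other side. Because a pipe holds only a small fixed kernel buffer (typically 64 KB on Linux), cat's memory footprint stays tiny no matter how big the input is.

```shell
# Stream 100 MB of zeros through cat and count the bytes received.
# cat reads a chunk, writes it to the pipe, and blocks when the pipe
# buffer is full -- it never holds more than a small buffer in memory.
head -c 100000000 /dev/zero | cat | wc -c
# prints 100000000
```

If cat were buffering the whole stream, this command's memory use would grow with the input size; in practice it stays flat, which suggests the RAM pressure is coming from PostgreSQL (shared buffers, index builds, etc.) rather than from cat.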
Now that I think about it, this seems to be more a question about cat and its memory usage than anything else. If there is a more appropriate forum for this question, please let me know. Thanks!
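For reference, these are the invocations I'm weighing, assuming the dump is a plain-SQL file (the filename mydb.pgdump is just my own naming; a custom-format dump would need pg_restore instead):

```shell
# The pipe version I'm currently using: cat copies the file into psql's stdin.
cat mydb.pgdump | psql mydb

# Shell redirection: the shell opens the file and psql reads it directly,
# with no extra cat process or pipe in between.
psql mydb < mydb.pgdump

# psql's -f flag: same effect, and error messages report the file name
# and line number, which helps with a 17 GB dump.
psql -d mydb -f mydb.pgdump
```

My understanding is that all three stream the file rather than loading it into memory, so the difference between them should be negligible next to the database's own work.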
© Server Fault or respective owner