Compressing large text data before storing it in the db?

Posted by Steel Plume on Stack Overflow
Published on 2010-03-18T21:22:44Z

Hello, I have an application that retrieves many large log files from systems on the LAN.

Currently I put all the log files into PostgreSQL; the table has a column of type TEXT, and I don't plan to run any searches on this text column, because a separate external process retrieves all the files nightly and scans them for sensitive patterns.

So the column could also be a BLOB or a CLOB, but now my question is the following: the database already has its own compression system, but could I improve on that compression manually, for example with common compressor utilities? And above all, what if I pre-compress each large file myself and then store it as binary in the table: is that pointless, since the database already provides internal compression?
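To make the idea concrete, the kind of manual pre-compression I mean would look roughly like the sketch below (Python with psycopg2, and a hypothetical logs table with a bytea column; the names and connection string are just placeholders):

    import zlib

    import psycopg2  # assumed PostgreSQL driver; any client library would do

    # Hypothetical schema for illustration:
    #   CREATE TABLE logs (filename text PRIMARY KEY, content bytea);

    def store_log(conn, path):
        """Read a log file, compress it, and store the compressed bytes as bytea."""
        with open(path, "rb") as f:
            raw = f.read()
        compressed = zlib.compress(raw, 6)  # DEFLATE, as common compressor utilities use
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO logs (filename, content) VALUES (%s, %s)",
                (path, psycopg2.Binary(compressed)),
            )
        conn.commit()

    def read_log(conn, path):
        """Fetch and decompress a stored log, e.g. for the nightly pattern scan."""
        with conn.cursor() as cur:
            cur.execute("SELECT content FROM logs WHERE filename = %s", (path,))
            (compressed,) = cur.fetchone()
        return zlib.decompress(bytes(compressed))

    if __name__ == "__main__":
        conn = psycopg2.connect("dbname=logdb")  # placeholder connection string
        store_log(conn, "/var/log/example.log")
        data = read_log(conn, "/var/log/example.log")

The obvious cost is that the nightly scanner would then have to decompress each file before pattern matching, which is part of what makes me wonder whether pre-compression is worth it over the database's built-in compression.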

