Compressing large text data before storing into db?
- by Steel Plume
Hello,
I have an application that retrieves many large log files from systems on the LAN.
Currently I store all the log files in PostgreSQL. The table has a column of type TEXT, and I don't plan to search on this column, because a separate external process retrieves all the files nightly and scans them for sensitive patterns.
So the column value could just as well be a BLOB or a CLOB, but now my question is the following:
the database already has its own compression system, but could I improve on that compression manually, e.g. with common compressor utilities? And above all, what if I pre-compress the large file myself and then store it as binary in the table? Is that pointless, given that the database already compresses internally?
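To make the question concrete, here is a minimal sketch of what I mean by pre-compressing (Python standard library gzip; the log line is a made-up example). The idea would be to store the compressed bytes in a binary column and decompress on read:

```python
import gzip

# Hypothetical sample: repetitive log text compresses very well.
log_text = "2024-01-01 12:00:00 INFO request handled in 35ms\n" * 10000
raw = log_text.encode("utf-8")

# Compress before inserting; the result would go into a binary column.
compressed = gzip.compress(raw, compresslevel=9)
print(f"raw: {len(raw)} bytes, gzip: {len(compressed)} bytes")

# Round-trip on read: decompress the stored bytes back to the text.
restored = gzip.decompress(compressed).decode("utf-8")
assert restored == log_text
```

Would this double compression (mine plus the database's own) just waste CPU, or is it worth it for data the database never needs to inspect?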