Memory Usage for Databases on Linux
- by Kyle Brandt
When looking at `free` output, what we generally care about for application memory usage is the free figure on the -/+ buffers/cache line. But what about database applications such as Oracle: is it important to keep a good amount of buffers and cache available so the database performs well with all its I/O?
If that makes sense, how do you figure out how much is enough?
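For context, the free figure on the -/+ buffers/cache line is just MemFree plus Buffers plus Cached, since the kernel can reclaim buffers and page cache for applications on demand. A minimal sketch of that arithmetic, assuming a Linux box with `/proc/meminfo` (the function name is my own, not from any tool):

```python
def effectively_free_kb(path="/proc/meminfo"):
    """Reproduce free's '-/+ buffers/cache' free column:
    MemFree + Buffers + Cached, all in kB."""
    fields = {}
    with open(path) as f:
        for line in f:
            key, rest = line.split(":", 1)
            fields[key] = int(rest.split()[0])  # values are reported in kB
    return fields["MemFree"] + fields["Buffers"] + fields["Cached"]

print(effectively_free_kb(), "kB effectively free")
```

The open question above is whether, for a database doing heavy I/O, some of that "reclaimable" cache should really be treated as in use rather than free.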