What's the best way to view/analyse/filter huge traces/logfiles?

Posted by oliver on Stack Overflow
Published on 2010-03-18T15:04:56Z, indexed on 2010/03/18 15:11 UTC

Filed under: testing | analysis

This seems to be a recurring issue: we receive a bug report for our software, and with it tons of traces or logfiles.
Since finding errors is much easier when you have a visualization of the log messages/events over time, it is convenient to use a tool that can display the progression of events in a graph or similar view (e.g. Wireshark, http://www.wireshark.org, for analyzing network traffic).

What tool do you use for such a purpose?

The problem with most tools I have used so far is that they mercilessly break down when you feed them huge data traces (> 1 GB), so some criteria for such a tool would be:

  • can deal with huge input files (> 1 GB)
  • is really fast (so you don't have to get coffee while a file is loading)
  • has some sort of filtering mechanism
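As a stopgap when no viewer copes with the file size, a streaming pre-filter can cut a multi-gigabyte trace down to something a GUI tool will load. Below is a minimal Python sketch of that idea; `filter_log` and the command-line usage are hypothetical illustrations, not part of any particular tool. Because it reads the file lazily, line by line, memory usage stays constant regardless of file size.

```python
import re
import sys

def filter_log(path, pattern):
    """Stream a logfile line by line, yielding only lines that match pattern.

    Lazy iteration keeps memory usage constant, so even multi-gigabyte
    traces can be filtered without loading them into RAM.
    """
    regex = re.compile(pattern)
    # errors="replace" keeps the filter from crashing on stray binary bytes
    with open(path, "r", errors="replace") as f:
        for line in f:
            if regex.search(line):
                yield line

if __name__ == "__main__":
    # Example usage: python filter_log.py trace.log "ERROR|WARN"
    for line in filter_log(sys.argv[1], sys.argv[2]):
        sys.stdout.write(line)
```

The same effect can be had with `grep -E "ERROR|WARN" trace.log > small.log`; the point is simply to shrink the input before handing it to a visualization tool.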

© Stack Overflow or respective owner
