Improve Log Exceptions

Posted by Jaider on Stack Overflow, 2012-10-19

I am planning to use log4net in a new web project. In my experience, I have seen how big the log table can get, and I have also noticed that errors and exceptions tend to be repeated. For instance, I just queried a log table with more than 132,000 records; using DISTINCT, I found that only about 2,500 records (~2%) are unique, and the rest (~98%) are duplicates. So I came up with this idea to improve logging.
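(For reference, the duplicate check was along these lines; dbo.Log with Message and Exception columns is just the typical log4net AdoNetAppender layout and may be named differently in your setup.)

    SELECT COUNT(*) AS TotalRows,
           (SELECT COUNT(*)
            FROM (SELECT DISTINCT Message, Exception FROM dbo.Log) AS d) AS UniqueRows
    FROM dbo.Log;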

Add a couple of new columns, counter and updated_dt, that are updated every time there is an attempt to insert the same record.
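Something like this, as a minimal sketch (dbo.Log and the exact column types are placeholders for whatever the log4net appender actually creates):

    -- Track how many times the same entry has been seen and when it last happened.
    ALTER TABLE dbo.Log ADD
        counter    int       NOT NULL CONSTRAINT DF_Log_counter DEFAULT (1),
        updated_dt datetime2 NOT NULL CONSTRAINT DF_Log_updated DEFAULT (SYSUTCDATETIME());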

If we want to track which user caused the exception, we need to create a user_log or log_user table to map the N-to-N relationship.
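A rough sketch of that mapping table (it assumes dbo.Log(Id) and dbo.[User](Id) already exist as primary keys):

    CREATE TABLE dbo.log_user (
        LogId  int NOT NULL REFERENCES dbo.Log (Id),
        UserId int NOT NULL REFERENCES dbo.[User] (Id),
        CONSTRAINT PK_log_user PRIMARY KEY (LogId, UserId)
    );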

Creating this model may make the system slow and inefficient if it has to compare all of this long text... Here is the trick: we should also have a binary(16) or binary(32) hash column that hashes the message and the exception, and configure an index on it. We can use HASHBYTES to help us. I am not a DB expert, but I think that would be the fastest way to locate a similar record. And because hashing doesn't guarantee uniqueness, it would only help to locate similar records much faster; we would then compare by message or exception directly to make sure they really are the same.
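Roughly what I have in mind, as a sketch: a persisted computed column over the two text columns plus an index on it (MD5 gives binary(16), SHA2_256 would give binary(32); the column names are again assumptions):

    -- Indexed hash of message + exception for fast duplicate lookups.
    ALTER TABLE dbo.Log ADD MsgHash AS
        CAST(HASHBYTES('MD5',
             CAST(ISNULL(Message,   N'') AS nvarchar(max)) +
             CAST(ISNULL(Exception, N'') AS nvarchar(max))) AS binary(16)) PERSISTED;

    CREATE NONCLUSTERED INDEX IX_Log_MsgHash ON dbo.Log (MsgHash);

    -- Note: before SQL Server 2016, HASHBYTES only accepts input up to 8000 bytes,
    -- so very long exception text may need to be truncated before hashing.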

This is a theoretical/practical solution, but will it work, or will it just bring more complexity? What aspects am I leaving out, or what other considerations do I need to have? The trigger would do the job of the insert-or-update, but is a trigger the best way to do it?
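For illustration, the trigger could look roughly like this INSTEAD OF INSERT sketch, building on the columns above (it assumes the usual log4net columns Date, Level, Logger, Message, Exception; an alternative would be to put the same insert-or-update logic in a stored procedure that the appender calls instead of a trigger):

    CREATE TRIGGER trg_Log_Dedupe ON dbo.Log
    INSTEAD OF INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Bump the counter on rows that already exist: match on the indexed hash first,
        -- then confirm with the full text, since the hash alone is not a guarantee.
        UPDATE l
        SET    l.counter    = l.counter + 1,
               l.updated_dt = SYSUTCDATETIME()
        FROM   dbo.Log AS l
        JOIN   inserted AS i
          ON   l.MsgHash = CAST(HASHBYTES('MD5',
                     CAST(ISNULL(i.Message,   N'') AS nvarchar(max)) +
                     CAST(ISNULL(i.Exception, N'') AS nvarchar(max))) AS binary(16))
         AND   ISNULL(l.Message,   N'') = ISNULL(i.Message,   N'')
         AND   ISNULL(l.Exception, N'') = ISNULL(i.Exception, N'');

        -- Insert only the rows that did not match an existing entry.
        INSERT INTO dbo.Log ([Date], [Level], Logger, Message, Exception)
        SELECT i.[Date], i.[Level], i.Logger, i.Message, i.Exception
        FROM   inserted AS i
        WHERE  NOT EXISTS (
                   SELECT 1
                   FROM   dbo.Log AS l
                   WHERE  l.MsgHash = CAST(HASHBYTES('MD5',
                              CAST(ISNULL(i.Message,   N'') AS nvarchar(max)) +
                              CAST(ISNULL(i.Exception, N'') AS nvarchar(max))) AS binary(16))
                     AND  ISNULL(l.Message,   N'') = ISNULL(i.Message,   N'')
                     AND  ISNULL(l.Exception, N'') = ISNULL(i.Exception, N''));
    END;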

