Need advice on comparing the performance of two equivalent LINQ to SQL queries

Posted by uvita on Stack Overflow See other posts from Stack Overflow or by uvita
Published on 2010-03-14T21:35:59Z Indexed on 2010/03/14 21:45 UTC

I am working on a tool to optimize LINQ to SQL queries. Basically it intercepts the LINQ execution pipeline and applies some optimizations, for example removing a redundant join from a query. Of course, this adds overhead to the execution time before the query reaches the DBMS, but the query itself should then be processed faster. I don't want to use a SQL profiler, because I already know that the generated query will perform better in the DBMS than the original one. What I am looking for is a correct way of measuring the total time between the creation of the query in LINQ and the end of its execution. Currently, I am using the Stopwatch class, and my code looks something like this:

const int amount = 100;
var sw = new Stopwatch();
sw.Start();
for (var i = 0; i < amount; i++)
{
   ExecuteNonOptimizedQuery();
}
sw.Stop();
Console.WriteLine("Executing the query {0} times took: {1}ms. On average, each query took: {2}ms",
   amount, sw.ElapsedMilliseconds, (double)sw.ElapsedMilliseconds / amount);
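To keep JIT compilation and connection-pool setup from inflating the first iteration, I have also considered running one untimed warm-up call before starting the stopwatch. A rough sketch of what I mean, assuming the same ExecuteNonOptimizedQuery helper:

```csharp
using System;
using System.Diagnostics;

// Hypothetical refinement: one untimed warm-up call so that one-time costs
// (JIT compilation, opening the first pooled connection) are excluded.
ExecuteNonOptimizedQuery(); // warm-up, deliberately not timed

const int amount = 100;
var sw = Stopwatch.StartNew();
for (var i = 0; i < amount; i++)
{
    ExecuteNonOptimizedQuery();
}
sw.Stop();
Console.WriteLine("Average per query: {0}ms",
    (double)sw.ElapsedMilliseconds / amount);
```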

Basically, the ExecuteNonOptimizedQuery() method creates a new DataContext, creates a query, and then iterates over the results.
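For reference, the helper looks roughly like this; the MyDataContext subclass, the tables, and the join are placeholders standing in for the real types in my project:

```csharp
// Rough shape of the helper. MyDataContext, Customers, and Orders are
// placeholders; the real query is whichever one the tool optimizes.
private static void ExecuteNonOptimizedQuery()
{
    using (var db = new MyDataContext())
    {
        var query = from c in db.Customers
                    join o in db.Orders on c.CustomerId equals o.CustomerId
                    select c.Name;

        // LINQ to SQL is lazy: enumerating forces the query to actually
        // run against the database, so iteration is part of the timing.
        foreach (var name in query)
        {
            // consume results
        }
    }
}
```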

I did this for both versions of the query, the normal one and the optimized one. I took the idea from this post by Frans Bouma.

Is there any other approach/considerations I should take?

Thanks in advance!

© Stack Overflow or respective owner
