Why are there performance differences when a SQL function is called from a .NET app vs when the same call is run from SSMS?

Posted by Dan Snell on Stack Overflow
Published on 2010-05-07T00:05:04Z


We are having a problem in our test and dev environments with a function that at times runs quite slowly when called from a .NET application. When we call the same function directly from Management Studio, it runs fine.
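One frequently cited cause of exactly this SSMS-vs-application gap is a mismatch in session SET options (most often ARITHABORT, which SSMS turns on by default and ADO.NET leaves off), causing SQL Server to cache a separate, possibly worse, execution plan for the application's connection. This is a diagnostic sketch, not a confirmed explanation for this case; it assumes you can query the DMVs while both connections are open:

```sql
-- Compare session SET options between the SSMS session and the
-- application's session while both are connected.
-- Rows with differing arithabort values are candidates for
-- getting different cached plans.
SELECT session_id,
       program_name,
       arithabort,          -- SSMS defaults this ON; ADO.NET defaults it OFF
       ansi_nulls,
       quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

If the options differ, running `SET ARITHABORT OFF` in an SSMS window before calling the function should reproduce the slow application-side plan, which confirms the diagnosis.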

Here are the profiler results for each case. From the application:
CPU: 906
Reads: 61853
Writes: 0
Duration: 926

From SSMS:
CPU: 15
Reads: 11243
Writes: 0
Duration: 31

We have determined that recompiling the function restores the expected performance: the profile when run from the application then matches what we get from SSMS. It starts slowing down again at what appear to be random intervals.
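The recompile workaround described above can be scripted. This is a minimal sketch; `dbo.MySlowFunction` and `dbo.MyTable` are placeholder names, and the `UPDATE STATISTICS` step assumes (unconfirmed here) that stale statistics plus parameter sniffing are the underlying cause:

```sql
-- Mark the function's cached plan for recompilation, so the next
-- call compiles a fresh plan against current parameter values.
EXEC sp_recompile N'dbo.MySlowFunction';

-- Optionally refresh statistics on the underlying tables, which
-- addresses a common root cause of a plan going bad over time.
UPDATE STATISTICS dbo.MyTable;
```

If the slowdowns track data churn, scheduling the statistics update (rather than blanket weekly recompiles, as in prod) may be the more targeted fix.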

We have not seen this in prod, but that may be in part because everything there is recompiled on a weekly basis.

So what might cause this sort of behavior?

© Stack Overflow or respective owner
