Performance of stored proc when updating columns selectively based on parameters?
Posted by kprobst on Stack Overflow, 2010-06-15
I'm trying to figure out if this is relatively well-performing T-SQL (this is SQL Server 2008). I need to create a stored procedure that updates a table. The proc accepts as many parameters as there are columns in the table, and with the exception of the PK column, they all default to NULL. The body of the procedure looks like this:
CREATE PROCEDURE proc_repo_update
     @object_id bigint
    ,@object_name varchar(50) = NULL
    ,@object_type char(2) = NULL
    ,@object_weight int = NULL
    ,@owner_id int = NULL
    -- ...etc
AS
BEGIN
    update
        object_repo
    set
        object_name   = ISNULL(@object_name, object_name)
       ,object_type   = ISNULL(@object_type, object_type)
       ,object_weight = ISNULL(@object_weight, object_weight)
       ,owner_id      = ISNULL(@owner_id, owner_id)
        -- ...etc
    where
        object_id = @object_id

    return @@ROWCOUNT
END
So basically:
Update a column only if its corresponding parameter was provided, and leave the rest alone.
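For example, a call that supplies only some of the optional parameters (the values here are made up for illustration) changes just those columns and leaves everything else alone:

DECLARE @rows int;

-- Only @object_name and @owner_id are passed, so only those two columns change.
EXEC @rows = proc_repo_update
     @object_id   = 42
    ,@object_name = 'renamed object'
    ,@owner_id    = 7;

-- @rows holds the @@ROWCOUNT value returned by the procedure (0 or 1 here).
SELECT @rows AS rows_affected;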
This works well enough, but since the ISNULL call returns the current column value whenever the corresponding parameter is NULL, will SQL Server optimize this somehow? This could become a performance bottleneck in the application, because the table will be updated heavily (inserts will be uncommon, so performance there is not a concern). So I'm trying to figure out the best way to do this. Is there a way to make the column assignments conditional, for example with CASE WHEN, as in the sketch below?
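This is what I mean by the CASE-based variant (a sketch only; as far as I can tell it is logically equivalent to the ISNULL version, since every column is still assigned on every update):

update
    object_repo
set
    object_name   = CASE WHEN @object_name   IS NULL THEN object_name   ELSE @object_name   END
   ,object_type   = CASE WHEN @object_type   IS NULL THEN object_type   ELSE @object_type   END
   ,object_weight = CASE WHEN @object_weight IS NULL THEN object_weight ELSE @object_weight END
   ,owner_id      = CASE WHEN @owner_id      IS NULL THEN owner_id      ELSE @owner_id      END
    -- ...etc
where
    object_id = @object_id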
The table will also be indexed up the wazoo for read performance. Is this the best approach? My alternative at this point is to build the UPDATE statement in application code (i.e. inline SQL) and execute it against the server. That would settle my doubts about performance, but I'd rather keep this in a stored proc if possible; a rough sketch of a dynamic-SQL variant that stays inside a procedure follows.
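One possible middle ground, which I have not tried, would be to build the statement dynamically inside the procedure with sp_executesql, adding a SET clause only for the parameters that were actually supplied; the values still travel as parameters. A rough sketch (the procedure name and the exact string assembly are just for illustration, and the column list is abbreviated):

CREATE PROCEDURE proc_repo_update_dynamic   -- hypothetical name
     @object_id bigint
    ,@object_name varchar(50) = NULL
    ,@object_type char(2) = NULL
    -- ...etc
AS
BEGIN
    DECLARE @sql nvarchar(max) = N'';
    DECLARE @rc int = 0;

    -- Append a SET clause only for the parameters that were actually supplied.
    IF @object_name IS NOT NULL SET @sql = @sql + N',object_name = @object_name';
    IF @object_type IS NOT NULL SET @sql = @sql + N',object_type = @object_type';
    -- ...etc

    IF @sql = N'' RETURN 0;   -- nothing to update

    -- Strip the leading comma and wrap the clauses in the full statement.
    SET @sql = N'update object_repo set ' + STUFF(@sql, 1, 1, N'')
             + N' where object_id = @object_id; set @rc = @@ROWCOUNT;';

    -- Values are passed as parameters, so the statement stays parameterized
    -- and plan-cache friendly; unused declared parameters are allowed.
    EXEC sp_executesql @sql
        ,N'@object_id bigint, @object_name varchar(50), @object_type char(2), @rc int OUTPUT'
        ,@object_id   = @object_id
        ,@object_name = @object_name
        ,@object_type = @object_type
        ,@rc          = @rc OUTPUT;

    RETURN @rc;
END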