Concurrent usage of table causing issues
Posted by Sven on Stack Overflow
Published on 2010-04-08T08:30:23Z
sql-server
Hello
In our current project we are interfacing with a third-party data provider. They need to insert data into a table of ours. These inserts can be frequent: every 1 minute, every 5 minutes, every 30 minutes, depending on how much new data they need to provide. They use the read committed isolation level. On our end we have an application, a Windows service, that calls a web service every 2 minutes to see if there is new data in this table. Our isolation level is repeatable read. We retrieve the records and update a column on these rows, roughly as sketched below.
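For illustration only, here is a minimal sketch of what our polling step does; the table name dbo.ProviderData and the IsProcessed column are hypothetical placeholders, not our real schema:

    -- Hypothetical polling step run by our Windows service every 2 minutes.
    -- dbo.ProviderData and IsProcessed are illustrative names only.
    SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;

    BEGIN TRANSACTION;

    -- Read the rows the provider inserted that we have not handled yet.
    SELECT Id, Payload
    FROM dbo.ProviderData
    WHERE IsProcessed = 0;

    -- Mark those rows as handled.
    UPDATE dbo.ProviderData
    SET IsProcessed = 1
    WHERE IsProcessed = 0;

    COMMIT TRANSACTION;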
Now the problem is that sometimes this third-party provider needs to insert a lot of data, say 5000 records. They do this in small transactions (5 rows per transaction), but they don't close the connection: they run one transaction and then the next until all records are inserted. This causes issues for our process; we receive a timeout.
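To be concrete, the provider-side pattern looks roughly like the following, repeated on a single open connection (again using the same hypothetical table and column names as above):

    SET TRANSACTION ISOLATION LEVEL READ COMMITTED;

    BEGIN TRANSACTION;
    -- Five rows per transaction, as the provider does.
    INSERT INTO dbo.ProviderData (Payload, IsProcessed)
    VALUES ('row 1', 0), ('row 2', 0), ('row 3', 0), ('row 4', 0), ('row 5', 0);
    COMMIT TRANSACTION;
    -- ...and so on, roughly 1000 such transactions for a 5000-row batch,
    -- all on the same connection, which is never closed in between.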
If this goes on for a long time the database becomes completely unstable. For instance, they may have stopped, but the table somehow still stays unavailable. When I try to do a select on the table, I get several records, but at a certain moment I don't get any response anymore. It just says retrieving data, but nothing comes back until I get a timeout exception.
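When a select hangs like that, one thing we could try from another connection is to check whether our session is blocked by someone else's locks (this is only an assumption on my part that lock blocking is the cause, not something we have confirmed):

    -- Shows requests that are currently blocked and who is blocking them.
    SELECT session_id,
           blocking_session_id,
           wait_type,
           wait_time,
           status
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;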
The only solution is to restart the database, and then I see the other records.
How can we solve this? What is the ideal isolation level setting in this scenario?