What's the best way to transfer a large dataset over a .NET web service?

Posted by Malvineous on Stack Overflow, 2010-06-16

I've inherited a C# .NET application which talks to a web service, and the web service talks to an Oracle database. I need to add an export function to the UI, to produce an Excel spreadsheet of some of the data.

I have created a web service function that runs a database query, loads the data into a DataTable, and returns it, which works fine for a small number of rows. However, on the full run there is enough data that the client application locks up for a few minutes and then reports a timeout error. Obviously this isn't the best way to retrieve such a large dataset.
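For reference, the current method looks roughly like this; the real query and connection handling are more involved, and the names here are just placeholders:

    using System.Data;
    using System.Data.OracleClient;   // the classic Microsoft Oracle provider
    using System.Web.Services;

    public class ExportService : WebService
    {
        private const string ConnectionString = "...";   // really comes from configuration

        // Runs the query, fills a DataTable and returns the whole thing in a
        // single response. This is the call that times out on the full data set.
        [WebMethod]
        public DataTable GetExportData()
        {
            var table = new DataTable("Export");
            using (var connection = new OracleConnection(ConnectionString))
            using (var adapter = new OracleDataAdapter("SELECT * FROM export_view", connection))
            {
                adapter.Fill(table);   // every row ends up in the web server's memory here
            }
            return table;
        }
    }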

Before I go ahead and come up with some dodgy way of splitting the call, I'm wondering if there is already something in place that can handle this. At the moment I'm thinking of a startExport function followed by repeated calls to a next50Rows function until there is no data left, but because web services are stateless I'm going to have to keep some sort of ID number around and deal with the associated permissions. On the plus side, it would mean I don't have to load the entire data set into the web server's memory.
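To make the idea concrete, here's a rough sketch of the kind of interface I'm picturing. The method names, the ROWNUM-windowed query and the "id" sort column are all placeholders rather than anything I've actually built:

    using System;
    using System.Data;
    using System.Data.OracleClient;
    using System.Web.Services;

    public class PagedExportService : WebService
    {
        private const string ConnectionString = "...";   // really comes from configuration

        // Hands back a token that the client quotes on every later call; this
        // is where permission checks and expiry would be tied in.
        [WebMethod]
        public string StartExport()
        {
            // e.g. validate the caller and record the export against this ID
            return Guid.NewGuid().ToString();
        }

        // Each call runs a ROWNUM-windowed query, so only one page of rows is
        // ever held on the web server at a time. startRow is the number of
        // rows the client has already received.
        [WebMethod]
        public DataTable GetNextRows(string exportId, int startRow, int count)
        {
            // check exportId is valid and permitted before doing anything else
            const string sql = @"
                SELECT * FROM (
                    SELECT q.*, ROWNUM rn FROM (
                        SELECT * FROM export_view ORDER BY id
                    ) q WHERE ROWNUM <= :lastRow
                ) WHERE rn > :firstRow";

            var page = new DataTable("ExportPage");
            using (var connection = new OracleConnection(ConnectionString))
            using (var command = new OracleCommand(sql, connection))
            {
                command.Parameters.Add(new OracleParameter("lastRow", startRow + count));
                command.Parameters.Add(new OracleParameter("firstRow", startRow));
                using (var adapter = new OracleDataAdapter(command))
                {
                    adapter.Fill(page);
                }
            }
            return page;   // an empty table tells the client there's nothing left
        }
    }

The idea is that each call only materialises one page of rows on the server, and the exportId is just a token for hooking permissions and expiry onto, not a handle to a cached copy of the whole result set.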

So if anyone knows a better way to retrieve a large amount of data (in a table format) over a .NET web service, please let me know!
