Errors with large data sources
Posted by The Sheek Geek on Stack Overflow, 2010-05-20
I'm doing some benchmarking on large data sources and binding/exporting data for reporting.
I started by filling a DataSet with 100,000 rows and then attempting to open a Crystal Report with the retrieved data. I noticed that the DataSet filled just fine (it took about 779 milliseconds); however, when I attempt to export the data to the report, or even bind it to a GridView, the application fails with an OutOfMemoryException.
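For reference, the fill step of my benchmark looks roughly like the sketch below. The connection string, query, and table name are placeholders, not the actual ones:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Diagnostics;

    class FillBenchmark
    {
        static void Main()
        {
            // Placeholder connection string and query.
            const string connStr = "Data Source=.;Initial Catalog=ReportDb;Integrated Security=True";
            const string sql = "SELECT TOP 100000 * FROM Orders";

            var table = new DataTable();
            var sw = Stopwatch.StartNew();

            using (var conn = new SqlConnection(connStr))
            using (var adapter = new SqlDataAdapter(sql, conn))
            {
                // Fill pulls every row into memory at once.
                adapter.Fill(table);
            }

            sw.Stop();
            Console.WriteLine("Filled {0} rows in {1} ms", table.Rows.Count, sw.ElapsedMilliseconds);
        }
    }

The fill itself completes quickly; the exception only appears once the in-memory table is handed to the report or the GridView.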
Has anyone experienced this before, or does anyone have an idea of how to get around it? It is very possible that clients will run reports for years' worth of data, so 100,000 rows is not inconceivable.
The application and the benchmark code are written in C# against Oracle and SQL Server databases. I still have some data sources to test, but I would like to know how to get around this in case I don't find a better solution.
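One approach I'm considering is paging the query server-side so that only one page of results is ever in memory at a time, rather than the full result set. A minimal sketch of that idea follows; the Orders table and its columns are hypothetical, and the ROW_NUMBER() syntax shown is SQL Server's (Oracle supports the same analytic function, or ROWNUM):

    using System.Data;
    using System.Data.SqlClient;

    class PagedFetch
    {
        // Number the rows server-side and return only the requested window.
        const string PageSql = @"
            SELECT OrderId, CustomerName, Total
            FROM (
                SELECT OrderId, CustomerName, Total,
                       ROW_NUMBER() OVER (ORDER BY OrderId) AS rn
                FROM Orders
            ) AS numbered
            WHERE rn BETWEEN @first AND @last";

        public static DataTable FetchPage(string connStr, int pageIndex, int pageSize)
        {
            var page = new DataTable();
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(PageSql, conn))
            {
                cmd.Parameters.AddWithValue("@first", pageIndex * pageSize + 1);
                cmd.Parameters.AddWithValue("@last", (pageIndex + 1) * pageSize);
                using (var adapter = new SqlDataAdapter(cmd))
                    adapter.Fill(page);   // only one page is in memory at a time
            }
            return page;
        }
    }

This would work for a paged GridView, but I don't know whether it helps with Crystal Reports, which presumably still needs the full data set to render the report.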