I've inherited a C# .NET application which talks to a web service, and the web service talks to an Oracle database. I need to add an export function to the UI, to produce an Excel spreadsheet of some of the data.
I have created a web service function to run a database query, load the data into a DataTable and then return it, which works fine for a small number of rows. However, there is enough data in the full run that the client application locks up for a few minutes and then returns a timeout error. Obviously this isn't the best way to retrieve such a large dataset.
Before I go ahead and come up with some dodgy way of splitting the call, I'm wondering if there is already something in place that can handle this. At the moment I'm thinking of calling a startExport function and then repeatedly calling a next50Rows function until there is no data left, but because web services are stateless this means I'm going to have to keep some sort of ID number around and deal with the associated permissions. It would mean that I don't have to load the entire dataset into the web server's memory, though, which is one good thing.
So if anyone knows a better way to retrieve a large amount of data (in a table format) over an ASMX web service, please let me know!
We had this exact business scenario a few years ago, and I'll tell you what we did.
- Try to limit the amount of data transferred
- If you're transferring n tables, split them into n DataTables and transfer one DataTable at a time.
- Compress your DataSet/DataTable before transferring. This makes a huge (huge huge) impact. On the other side, decompress the byte stream back into a DataSet/DataTable. Don't use .NET's built-in compression either - use SharpZipLib. It gives much better results.
- Also, you can perform the transfer asynchronously to keep the client from locking up.
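To illustrate the compression step, here is a rough sketch of round-tripping a DataSet through SharpZipLib's GZip streams. The class and method names are mine; on the server you'd return the byte array from your web method, and on the client you'd feed the received bytes back through `Decompress`:

```csharp
using System.Data;
using System.IO;
using ICSharpCode.SharpZipLib.GZip;  // SharpZipLib package

public static class DataSetCompression
{
    public static byte[] Compress(DataSet ds)
    {
        using (var buffer = new MemoryStream())
        {
            using (var gzip = new GZipOutputStream(buffer))
            {
                // WriteSchema keeps column/type info so the
                // client can reconstruct the DataSet exactly.
                ds.WriteXml(gzip, XmlWriteMode.WriteSchema);
            }
            return buffer.ToArray();
        }
    }

    public static DataSet Decompress(byte[] bytes)
    {
        using (var buffer = new MemoryStream(bytes))
        using (var gzip = new GZipInputStream(buffer))
        {
            var ds = new DataSet();
            ds.ReadXml(gzip, XmlReadMode.ReadSchema);
            return ds;
        }
    }
}
```

The XML representation of a DataSet is extremely repetitive, which is exactly why GZip gets such a large win here.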
Our customers have been using the above solution, without issue, for years.
I suspect you just want to call the web service asynchronously, so your application doesn't block the UI while the transfer is going on. Take a look at http://msdn.microsoft.com/en-us/library/ms233842.aspx and see if it helps.
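With a .NET 2.0+ generated ASMX proxy you get a `MethodAsync`/`MethodCompleted` pair for free. A sketch of the pattern, where `ExportService` and `GetExportData` are placeholders for your own proxy class and web method:

```csharp
// Kick off the call without blocking the UI thread.
var proxy = new ExportService();
proxy.GetExportDataCompleted += (sender, e) =>
{
    if (e.Error != null)
    {
        MessageBox.Show(e.Error.Message);
        return;
    }
    // The completed event is raised back on the UI thread
    // in a WinForms app, so it's safe to touch controls here.
    BindResults(e.Result);
};
proxy.GetExportDataAsync();  // returns immediately; the form stays responsive
```

Note this only fixes the lock-up; if the server itself times out building the result, you still need one of the other answers too.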
I'd recommend offloading the task to SSRS, which handles long-running queries without a problem. You can build a report with your report builder, then use your .NET app to invoke the SSRS web services to run your report and export the results in Excel format (as a byte array). See this page for sample code using the .asmx service, and this page for invoking SSRS with WCF.
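Roughly, invoking the execution service from a proxy generated against ReportExecution2005.asmx looks like the sketch below; the report path and output file name are placeholders, and the linked pages cover the details I've glossed over:

```csharp
var rs = new ReportExecutionService();
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

// The query runs on the report server, not in your app,
// so a long-running query doesn't tie up the client.
rs.LoadReport("/Exports/MyExport", null);

string extension, mimeType, encoding;
Warning[] warnings;
string[] streamIds;

byte[] excelBytes = rs.Render("Excel", null,
    out extension, out mimeType, out encoding,
    out warnings, out streamIds);

File.WriteAllBytes("export.xls", excelBytes);
```

You get pagination, scheduling, and other formats (PDF, CSV) essentially for free once the report exists.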
You should also call your webservice in a separate thread to avoid tying up your UI.
I recommend WCF Data Services (or whatever it's called now). You can expose an OData endpoint which (if you have .NET on both sides of the web service) gives you transparent request throttling: the server limits the number of rows in a single response, and the client automatically makes additional requests as necessary. It also gives you an IQueryable-style interface, which is extremely flexible.
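As a rough sketch (the entity set name `Orders` and context type are placeholders): the server caps the page size, and the client follows continuation tokens until the result set is exhausted:

```csharp
// Server: cap every response at 50 entities; the service
// emits a continuation link for the next page automatically.
public class ExportDataService : DataService<MyEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
        config.SetEntitySetPageSize("Orders", 50);
        config.DataServiceBehavior.MaxProtocolVersion =
            DataServiceProtocolVersion.V2;
    }
}

// Client: walk the pages via continuation tokens.
var context = new MyEntities(new Uri("http://server/Export.svc"));
var response = (QueryOperationResponse<Order>)
    ((DataServiceQuery<Order>)context.Orders).Execute();

DataServiceQueryContinuation<Order> token;
do
{
    foreach (var order in response)
        Process(order);

    token = response.GetContinuation();
    if (token != null)
        response = context.Execute(token);  // fetch the next page
} while (token != null);
```

The server never materializes more than one page at a time, which addresses both the memory and timeout problems.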
See "Large Data and Streaming" in the MSDN Library.
Do not use ASMX web services if you're interested in performance of large data transfers.
The problem is common: you have a lot of data to return to the user.
If you are lucky, you may get away with simply increasing the web service timeout. Otherwise, would returning a page of data at a time back to the user suit your requirements? In this design, the caller would be responsible for passing the start page and the number of rows per page to your web service.
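A paged web method along those lines might look like this sketch, using Oracle's ROW_NUMBER() to do the slicing server-side; the connection string, table, and ordering column are placeholders:

```csharp
[WebMethod]
public DataTable GetExportPage(int pageNumber, int pageSize)
{
    // Rows outside the requested window are never pulled into
    // the web server's memory - Oracle does the slicing.
    const string sql = @"
        SELECT * FROM (
            SELECT t.*, ROW_NUMBER() OVER (ORDER BY t.ID) AS RN
            FROM   EXPORT_DATA t
        )
        WHERE RN > :lo AND RN <= :hi";

    using (var conn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand(sql, conn))
    {
        cmd.Parameters.Add("lo", (pageNumber - 1) * pageSize);
        cmd.Parameters.Add("hi", pageNumber * pageSize);

        // The table must be named, or ASMX serialization will fail.
        var table = new DataTable("ExportPage");
        new OracleDataAdapter(cmd).Fill(table);

        // A page with fewer than pageSize rows (or none) tells
        // the caller to stop requesting.
        return table;
    }
}
```

Because each call is self-contained, this stays stateless - no export ID to track between requests, at the cost of re-running the (windowed) query per page.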
If, however, you are not restricted to using a web service as the solution, then you might like to investigate offering Reporting Services to your users. Definitely worth your while to investigate this alternate option :)