I have the following method that takes a list of a model, converts it to a CSV string, and writes it to an HttpResponseMessage:
private HttpResponseMessage ConvertToCsvFileResponse(IEnumerable<VolumeMetric> filterRecords, string fileName)
{
    var csvBuilder = new StringBuilder();

    // Write the header.
    csvBuilder.AppendLine("SourceSystem,BillingYear,BillingMonth,NewVolumeBytes,IncrementalVolumeBytes");

    // Write the data lines.
    foreach (var record in filterRecords)
        csvBuilder.AppendFormat("{0},{1},{2},{3},{4}{5}", record.SourceSystemReference, record.BillingYear, record.BillingMonth, record.NewVolumeBytes, record.IncrementalVolumeBytes, Environment.NewLine);

    // Convert to an HTTP response message for output from the API.
    // The content has to be assigned before its headers are set; otherwise
    // Content is still null, and assigning it afterwards would discard the headers.
    var result = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StringContent(csvBuilder.ToString())
    };
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment") // attachment forces a download
    {
        FileName = fileName
    };
    return result;
}
This method is called from my API controller, and the response is returned to the client as follows:
[HttpGet]
[Route("{year}")]
public HttpResponseMessage GetByYear(int year)
{
    try
    {
        var entries = _auditTableStorage.ListEntities<VolumeMetric>("VolumeMetrics");
        return ConvertToCsvFileResponse(entries.ToList(), $"VolumeMetrics_{year}.csv");
    }
    catch (Exception ex)
    {
        return new HttpResponseMessage(HttpStatusCode.InternalServerError)
        {
            Content = new StringContent($"Problem occurred connecting to storage: {ex.Message}")
        };
    }
}
I generate the data by querying Azure Table Storage - but that's kind of irrelevant to my question. The data comes from the source in a paginated format, and my current approach is to gather all of it first (paging through until the end) into a list, then run the method at the top to output it as CSV.
As you can imagine, this works well for small to medium data sets but quickly becomes inefficient for large ones.
Ideally, I'd like to modify my code to write the CSV in chunks and stream the file down to the client making the request - I'm just not sure how to convert what I have into a streamed version.
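Something along the lines of the sketch below is what I'm imagining, assuming Web API's PushStreamContent is the right tool for this (ConvertToCsvStreamResponse is just a hypothetical replacement for my method above, not something I have working):

// Rough sketch only - PushStreamContent comes from System.Net.Http.Formatting (Web API);
// also needs System.IO, System.Net and System.Threading.Tasks.
private HttpResponseMessage ConvertToCsvStreamResponse(IEnumerable<VolumeMetric> records, string fileName)
{
    // The callback runs when the response body is being written, so each row
    // goes straight to the output stream instead of into one big StringBuilder.
    Func<Stream, HttpContent, TransportContext, Task> writeCsv = async (outputStream, httpContent, transportContext) =>
    {
        using (var writer = new StreamWriter(outputStream))
        {
            await writer.WriteLineAsync("SourceSystem,BillingYear,BillingMonth,NewVolumeBytes,IncrementalVolumeBytes");

            foreach (var record in records)
            {
                await writer.WriteLineAsync(
                    $"{record.SourceSystemReference},{record.BillingYear},{record.BillingMonth},{record.NewVolumeBytes},{record.IncrementalVolumeBytes}");
            }
        } // disposing the writer flushes and closes the stream, completing the response
    };

    var result = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new PushStreamContent(writeCsv, new MediaTypeHeaderValue("application/octet-stream"))
    };
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = fileName
    };
    return result;
}

I'm not sure whether this is the idiomatic way to do it, or how best to combine it with the paging from table storage so I'm not still materialising the whole list up front. Any pointers greatly appreciated!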