I wrote an ASHX handler that streams files to the browser so that only authorized users can get to them.
The problem is that when I stream big files (40+ MB), the session is gone and the browser download suddenly interrupts after ~40 MB.
I have web.config configured not to time out before 240 minutes.
Testing this locally doesn't reproduce the problem; testing it on my shared host does.
Can anyone point me in the right direction?
I tried with and without Response.Clear().
public void ProcessRequest(HttpContext context)
{
    int id;
    if (new Core.SecurityManager().CurrentUser != null)
    {
        try
        {
            id = Convert.ToInt32(context.Request.QueryString["id"]);
        }
        catch
        {
            throw new ApplicationException("id could not be parsed.");
        }

        string filename = new DocumentFactory().SelectDocumentById(id).Filename;
        string filePath = context.Server.MapPath("~/uploads/" + filename);

        //context.Response.Clear();
        context.Response.AddHeader("content-disposition", "attachment; filename=" + filename);
        context.Response.ContentType = "application/octet-stream";
        context.Response.WriteFile(filePath);
        //context.Response.Flush();
        //context.Response.End();
    }
    else
    {
        throw new AuthenticationException();
    }
}
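For reference, a variant using Response.TransmitFile, which as I understand it hands the file to IIS without buffering it in managed memory, would look roughly like this (same filename/filePath lookup as above):

// TransmitFile writes the file to the response without
// loading it into application memory.
context.Response.ContentType = "application/octet-stream";
context.Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);
context.Response.TransmitFile(filePath);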
Web.config:
<sessionState mode="InProc" cookieless="false" timeout="240"></sessionState>
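As far as I know, the session timeout is not the only limit in play: ASP.NET also aborts requests that run longer than httpRuntime's executionTimeout (110 seconds by default when debug="false"), and shared hosts often lower such limits. Raising it would look something like this (the value is illustrative):

<system.web>
    <httpRuntime executionTimeout="14400" />
</system.web>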
Edit: I tried the following, but the download still interrupts:
using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    byte[] byteArray = new byte[fs.Length];

    // Read the whole file into the byte array (a single Read is not
    // guaranteed to fill the buffer, so loop until it is full).
    int bytesRead = 0;
    while (bytesRead < byteArray.Length)
        bytesRead += fs.Read(byteArray, bytesRead, byteArray.Length - bytesRead);

    using (MemoryStream ms = new MemoryStream(byteArray))
    {
        long dataLengthToRead = ms.Length;
        int blockSize = dataLengthToRead >= 5000 ? 5000 : (int)dataLengthToRead;
        byte[] buffer = new byte[blockSize];

        context.Response.Clear();
        // Clear the content and headers of the response.
        context.Response.ClearContent();
        context.Response.ClearHeaders();

        // Buffer the response so that the page is sent
        // after processing is complete.
        context.Response.BufferOutput = true;

        // Add the file name as an attachment, which forces
        // the Open/Save/Cancel dialog to show.
        context.Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);
        // To bypass the Open/Save/Cancel dialog instead:
        //context.Response.AddHeader("Content-Disposition", "inline; filename=" + filename);

        // Add the file size to the response header.
        context.Response.AddHeader("Content-Length", fs.Length.ToString());

        // Set the content type.
        context.Response.ContentType = "application/octet-stream";

        // Write the document into the response in blocks.
        while (dataLengthToRead > 0 && context.Response.IsClientConnected)
        {
            int lengthRead = ms.Read(buffer, 0, blockSize);
            context.Response.OutputStream.Write(buffer, 0, lengthRead);
            //context.Response.Flush();
            dataLengthToRead = dataLengthToRead - lengthRead;
        }

        context.Response.Flush();
        context.Response.Close();
    }
}
// End the response.
context.Response.End();
When I go straight to the file in the browser using its full path, it downloads without any problem whatsoever.
However, the correct way to deliver big files in IIS is the following:
Set MinBytesPerSecond to zero in WebLimits (this will certainly help performance, because IIS otherwise closes clients holding KeepAlive connections whose transfers are too slow); a config sketch follows this list.
Allocate more worker processes to the application pool; I have set it to 8, but this should be done only if your server is mostly distributing large files. It will make other sites on the server perform slower, but it ensures better deliveries. We set it to 8 because this server hosts only one website and it just delivers huge files.
Turn off App Pool Recycling
Turn off Sessions
Leave Buffering On
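MinBytesPerSecond lives in applicationHost.config rather than web.config; if I remember correctly, the setting looks like this:

<!-- applicationHost.config: stop IIS from dropping slow transfers -->
<system.applicationHost>
    <webLimits minBytesPerSecond="0" />
</system.applicationHost>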
Before each of the following steps, check whether Response.IsClientConnected is true; otherwise give up and don't send anything.
Set Content-Length before sending the file
Flush the Response
Write to the output stream, and flush at regular intervals, as sketched below.
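A sketch of how those steps might fit together inside a handler (the 64 KB buffer size is an arbitrary choice; filePath and filename are assumed to come from the same lookup as in the question):

using (FileStream fs = File.OpenRead(filePath))
{
    context.Response.ContentType = "application/octet-stream";
    context.Response.AddHeader("Content-Disposition", "attachment; filename=" + filename);

    // Set Content-Length before sending the file.
    if (!context.Response.IsClientConnected) return;
    context.Response.AddHeader("Content-Length", fs.Length.ToString());

    // Flush the response once before streaming the body.
    if (!context.Response.IsClientConnected) return;
    context.Response.Flush();

    // Write to the output stream, flushing at regular intervals,
    // and give up as soon as the client disconnects.
    byte[] buffer = new byte[64 * 1024];
    int read;
    while (context.Response.IsClientConnected &&
           (read = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        context.Response.OutputStream.Write(buffer, 0, read);
        context.Response.Flush();
    }
}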