Truncated download with big file

I’m having trouble downloading a big file (4.59 GB) through a Xojo web app.
For simplicity, I made a test app with the minimum essential code.
Using this code:

Session.wf = WebFile.Open(GetFolderItem("../../../file.iso", 3))
Session.wf.ForceDownload = True
Session.wf.UseCompression = False
ShowURL(Session.wf.URL)

When the download is ready to start, the browser shows the dialog asking me where to save the file, and it reports the size as 282 MB instead of 4.59 GB.
When I download the file, it is truncated and I can’t open it.

Is this a 32- or 64-bit web app? Does the server have 4.59 GB of RAM free?

Compiling either 32-bit or 64-bit, the download is always truncated at 282 MB.
Regarding free RAM, you are right; the server does not have 4.59 GB of RAM free.
So, to work correctly, must the server have as much free RAM as the size of the file being downloaded? Is that right?
That makes sense, and I can increase the server’s RAM.
But what is the meaning of the magic number 282 MB?

@Greg O’Lone did you really implement this without streaming from a BinaryStream when you have a FolderItem?

If you load it into memory, it should raise an exception early, shouldn’t it?

Side question: why do you use 3 as the second parameter in WebFile.Open? It expects a Boolean, so why does it even compile?

[quote=341127:@Christian Schmitz]@Greg O’Lone did you really implement this without streaming from a BinaryStream when you have a FolderItem?

If you load it into memory, it should raise an exception early, shouldn’t it?[/quote]
I just tested; it throws an OutOfMemoryException.

I tested downloading a 1.72 GB file, and it just worked. OS X with Firefox.

[quote=341127:@Christian Schmitz]@Greg O’Lone did you really implement this without streaming from a BinaryStream when you have a FolderItem?

If you load it into memory, it should raise an exception early, shouldn’t it?[/quote]
No, it streams. The problem is that we write data to the outgoing buffer faster than the OS transmits it, so it fills up.

A better solution would be to use Apache or another web server to deliver the file.

Is there a feature request or report that addresses this issue? It shouldn’t be too difficult to just send 256 KB at a time and add another chunk when the buffer empties to less than 128 KB or so. I did that in a web server I wrote in RB back in the day, and it delivered multi-gigabyte files just fine.

I think you can ask the OS how much is in the outgoing buffer and only feed it if it’s below a threshold.
That threshold could be 10 MB, of course.
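The polling idea can be sketched at the socket level, since Xojo’s TCPSocket exposes a BytesLeftToSend property. This is only a rough sketch of the throttling technique, not Xojo’s internal implementation; the names mySocket, stream, and the timer period are assumptions:

```xojo
' Hypothetical throttled sender: a Timer.Action handler that polls the
' TCPSocket's outgoing buffer via BytesLeftToSend and only queues more
' data while the buffer is below a threshold, keeping memory bounded.
' "stream" is assumed to be a BinaryStream opened on the file being served.
Sub Action() ' Timer.Action, firing every 100 ms or so
  Const kThreshold = 10485760 ' 10 MB threshold, as suggested above
  Const kChunk = 262144       ' refill in 256 KB chunks

  If mySocket.BytesLeftToSend < kThreshold And Not stream.EOF Then
    ' The buffer has drained enough: queue the next chunk.
    mySocket.Write(stream.Read(kChunk))
  ElseIf stream.EOF And mySocket.BytesLeftToSend = 0 Then
    ' Everything has been transmitted: clean up.
    stream.Close
    mySocket.Close
  End If
End Sub
```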

Well, it seems the problem is a shortcoming in Xojo, but it also seems there are some suggestions for addressing it.
I’m sure the great team at Xojo Inc will solve the problem as soon as possible… meanwhile I’ll increase the server’s RAM :slight_smile:

@Pietro Beccegato That won’t help, but a 64-bit build is more likely to work.

We already send in 256K chunks. It’s not our buffer that fills up… it’s the buffer on the OS.

Think of it this way… you have a plastic bucket with a 1 cm hole in the bottom. Now fill that bucket at a rate of 256 ml/s. Unless you have a really big bucket, it’s going to overflow when you dump 4.7 million liters of water in there, unless you have a way to measure how fast the water is leaving the bucket and throttle the input.

Here’s the thing, though… sending a file this big is going to kill the performance of your app. It’ll be a persistent high-CPU process that could easily run for 20 or 30 minutes even on a high-bandwidth connection. I still think you’d be better off offloading this work: if they’re ISOs, put them on a CDN somewhere and just send a link.
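The hand-off can be as simple as pointing the browser at a copy of the file hosted elsewhere, using the same ShowURL call as the original code. The host name here is a placeholder, not a real server:

```xojo
' Instead of pushing the bytes through the Xojo app process, redirect
' the browser to the file on a static file server or CDN.
' (downloads.example.com is a hypothetical host.)
ShowURL("https://downloads.example.com/file.iso")
```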

I understand the reasons to serve up large files with a CDN, but that’s not always a good idea. Not every app is meant to serve a giant public internet userbase. What about internal servers on a LAN?

I need to dig out my old code and test it. I’m pretty sure I was able to send and receive files up to 4 GB in size without issues in my old RB server from around 2011.

OK, the latest code was updated in 2014.

This is in the SendProgress event handler:

' Top up the outgoing buffer only once it has drained below 256 KB,
' reading the next 512 KB chunk from the file being served.
If bytesLeft < 256 * 1024 And Not getFile.EOF Then
  Me.Write(getFile.Read(512 * 1024))
End If

Last I checked, this code worked fine. I haven’t tested it in some time, though, as I only had one particular user who ran into issues. They used it for uploading and downloading large files internally.

I can check it again and see whether it only worked because I happened to have enough RAM in my computer, or because it was on the LAN and sending fast enough to clear the buffer before RAM filled up.

If I was just lucky, this seems like something that should be fixed. Otherwise that SendProgress event is full of false hope and deception. :wink: