WebFile explodes at 70 Megs

Since 2013, I have been happily using a CGI program closely based on the Downloading Web example.

The problem is, today I uploaded a program created with 2016R3 which, once packed in the installer and zipped, comes to 71.8 MB.

When I try to download it through the program, it freezes for a couple of seconds before ending up in an Error 500.

I am not going to blame Xojo Web. After all, WebFiles were probably intended for picture files and other content that is not as big.

But still, I have a problem: how am I going to work around this one?

Any idea welcome.

Are you trying to protect the original file, or can you just redirect to it?
I have PHP that might help for protected downloads (it powers the HTML Edit unique downloads)

I had to move away from WebFile and build my own file handler because WebFile would load the entire file in memory first. Get a couple users downloading the same file and your server will run out of RAM and crash.

[quote=311819:@Tim Parnell]Are you trying to protect the original file, or can you just redirect to it?
I have PHP that might help for protected downloads (it powers the HTML Edit unique downloads)[/quote]

That is the whole point. All my software is stored outside the web space, and the delivery software makes it available.

I would like to keep the same software, but replace the WebFile. Phillip, how do you manage that one?

[quote=311822:@Michel Bujardet]That is the whole point. All my software is off the web space, and the delivery software makes it available.

Phillip, how do you manage that one ?[/quote]

You can’t with default Xojo Web. You end up having to rely on a different web server, using PHP as Tim is doing. You could build a PHP script that only gives access to the file if the query string includes a valid session ID or something. PHP/Apache are better at file delivery.
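The gating idea can be sketched generically. This is a minimal sketch in Python rather than PHP, with hypothetical names (`resolve_download`, the token dict); the real token store would be shared with the Xojo session layer, e.g. a database table of active session IDs:

```python
import os

def resolve_download(token, tokens, base_dir):
    """Return the absolute path for a valid token, or None.

    `tokens` is a hypothetical dict mapping active session tokens to
    file names. The realpath check guards against path traversal, so a
    token can never be used to reach outside base_dir.
    """
    name = tokens.get(token)
    if name is None:
        return None
    path = os.path.realpath(os.path.join(base_dir, name))
    if not path.startswith(os.path.realpath(base_dir) + os.sep):
        return None
    return path
```

An unknown token, or one pointing at `../something`, simply gets None back and the script can answer 403 instead of serving the file.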

Xojo Web only raises the HandleURL event after the incoming request is complete. You then have to return True after writing all the data to the response, so you end up in the same scenario.

For a very large SaaS project that we have deployed using Xojo, I built my own web server to handle large files. I can now move multi-gigabyte files without RAM ever going up more than 10 MB. This doesn’t help you, but ultimately WebFile is very limited because Xojo Web’s embedded server is very limited.

Have you tried running it standalone? There could be an issue with that much data running through CGI.

I simply don’t want the overhead of standalone. I prefer CGI.

I will probably go the PHP route for the download step. It is not very difficult to ShowURL the PHP script when needed.

Not sure if it will help your situation, but WebFile.Open has a parameter to not load the entire file into memory:

http://documentation.xojo.com/index.php/WebFile.Open

That is interesting. Let me run some experiments. Thank you Paul.

OK. With the parameter set not to load everything in memory, the program stays unresponsive for about 6-7 seconds, then I get a 502 Bad Gateway.

Clearly, Xojo Web has trouble with big files.

Can’t you create the WebFile in the App.Open or Session.Open event, and maintain a reference, so that when the user downloads it will be ready in advance? (Of course with WebFile.Open inMemory = False.)

The app does one thing: verify the customer credentials and trigger the download. So indeed it creates the WebFile immediately before offering it for download. I see no other way.

That said, I looked at the PHP script, and it seems pretty easy to point to it for the download.

I will have to modify my Xojo program for that, but nothing major.

For clarity, the problem here is actually the socket itself. When you specify InMemory=False, the WebFile merely contains a FolderItem. When the framework requests it, the contents of the file are transferred to the socket a little at a time.

The problem comes in because sockets are sort of like filling a water bucket with a small hole in the bottom while wearing a blindfold. You can pour data into the top as quickly as you want, but the output speed is limited and much slower than what you put in. Unfortunately because you can’t see the bucket, you also don’t know how slowly it’s emptying and can’t slow the rate at which you add to it.
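The memory-bounded alternative being described can be sketched in a few lines (Python here, as a generic illustration rather than Xojo framework code): copy the file to the output in fixed-size chunks, so memory never holds more than one chunk, and a blocking destination naturally applies backpressure because a slow reader makes each write block instead of letting the "bucket" overflow.

```python
def stream_file(src, dst, chunk_size=64 * 1024):
    """Copy src to dst one chunk at a time.

    Memory use is bounded by chunk_size regardless of file size. With a
    blocking socket as dst, a slow client simply makes write() block,
    pacing the sender instead of piling data up in RAM.
    """
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            return total
        dst.write(chunk)
        total += len(chunk)
```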

How can this problem be avoided using only Xojo code?

It is way easier in PHP, but it seems possible to do something through HandleURL using BinaryStream.
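The shape of such a handler can be sketched generically. This is a Python `http.server` sketch with a hypothetical file name, standing in for a HandleURL handler that opens a BinaryStream and loops Read/Write in small blocks; whether Xojo actually flushes each write incrementally before you return True is exactly what the thread is debating, so treat this as the pattern, not a Xojo answer:

```python
import http.server
import os
import shutil

class ChunkedFileHandler(http.server.BaseHTTPRequestHandler):
    # Hypothetical fixed file for the sketch; a real handler would map
    # the request path or a session token to a file first.
    file_path = "payload.bin"

    def do_GET(self):
        size = os.path.getsize(self.file_path)
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(size))
        self.send_header("Content-Disposition",
                         'attachment; filename="payload.bin"')
        self.end_headers()
        # The BinaryStream analogue: read and write 64 KB at a time,
        # so memory stays flat for multi-gigabyte files.
        with open(self.file_path, "rb") as f:
            shutil.copyfileobj(f, self.wfile, 64 * 1024)

    def log_message(self, *args):
        pass  # keep the sketch quiet
```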

Your solution is to either use an external web server (as is being discussed) or to not allow users to download large files from within your app; otherwise your app will run out of memory.

Internally we will need to revisit this in the new web framework.

I absolutely need to let users download: they paid for the product…

It is not a huge issue, though. The PHP script is fairly simple to do.

Do you need a feature request?

No

Thank you Greg.
I’ll see what the new framework can do about this.

Regards.