Large File Downloads via Web possible?

I’d be happy with any solution to share files via a web page. It could be a Web App or a Desktop App. I just need to let folks download big files from a web interface.

The Example Projects/Web/XojoCloudFileManager example has the following code, but the app spontaneously quits when sending a big file. Is there a way to make that work?

[code] Dim f As FolderItem = SelectedFile

If f <> Nil Then
mDownloadFile = WebFile.Open(f, False)
mDownloadFile.ForceDownload = True

Call mDownloadFile.Download

End If[/code]

In the Example Projects/Communication/Internet/Web Server/WebServer.xojo_binary_project example, I tried adding a flush, but that just seems to stop the download early…

[code]bs = BinaryStream.Open(f, False)
While Not bs.EOF
  writeCount = writeCount + 1
  Self.Write(bs.Read(1024))
  If writeCount > 10 Then
    Self.Flush
    writeCount = 0
  End If
Wend
bs.Close[/code]

A flush on a socket is often a bad idea, as it may delay things.
It may be best to have the files on an Apache web server and just tell the browser to download the right file.

[quote=160489:@Hal Gumbert]I’d be happy with any solution to share files via a web page. It could be a Web App or a Desktop App. I just need to let folks download big files from a web interface.[/quote]

In HTML, you would simply point a link to the file in a webspace, such as http://mydomain.com/myfile.zip

I believe you can do the same from a Xojo app with ShowURL.
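
A minimal sketch of that approach, assuming the file already sits on an external web server (the URL is hypothetical):

[code]' Let the browser fetch the file straight from a static web server
' instead of streaming it through the Xojo app.
ShowURL("http://mydomain.com/myfile.zip")[/code]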

Thanks for everyone’s help on this. I’m really bummed that Xojo can’t do this. I have a few apps that need this ability… :frowning:

I’m still struggling to serve large file downloads from a Xojo app. I don’t care if it’s a desktop app or a web app. Christian submitted Feedback case 37770 about this.

In the meantime, is there a way in Xojo to take a file and segment it, so I could send the segments and then un-segment them? Maybe using gzip or something like that? I’ve been perusing the docs and haven’t found anything yet. :frowning:

I have never written a server, but can you do anything with the socket’s BytesLeftToSend property so as not to overload it?

PS. If you really want trouble, use .Flush. On Windows it can cause havoc, and I have seen comments on Java and C++ forums to the same effect. I recently tried to flush an incoming socket where there was no data, and none to come, and the program just went into a terminal hang.
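
A minimal sketch of the BytesLeftToSend idea, assuming Sock is a TCPSocket mid-transfer and mStream is an open BinaryStream on the file (both names hypothetical), driven from a multiple-mode Timer’s Action event:

[code]' Only top up the socket when its outgoing buffer is nearly drained,
' so the whole file is never held in memory at once.
If Sock.BytesLeftToSend < 64 * 1024 Then
  If Not mStream.EOF Then
    Sock.Write(mStream.Read(128 * 1024))
  Else
    mStream.Close
    Me.Mode = Timer.ModeOff ' transfer done; stop polling
  End If
End If[/code]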

A solution could be to put the file in the same folder as the web app under a special name and let Apache serve it,
e.g. with a random file name, so people can’t predict it.

For this app, I can’t assume that the user has Apache installed. My app needs to do everything…

I don’t mind slicing the file into segments, passing the segments and then putting them back together. Is there a way to do that in Xojo?

Wow! Things are not going well. I tried modifying the example server project by putting in a timer, checking how much data was in the send buffer, and reading more into my own buffer when required (Win 7 PC). It worked fine until about 400-500MB of a 2.6GB file, then locked up each time. The timer was not being executed and the server program had to be clobbered. Slowing the timer period did not fix it.

To answer your segmenting question: you could just do that on the server by reading chunks of the file, using either sequential socket use or multiple sockets. At the client side you could reassemble the chunks, either by passing each socket’s received data through a single routine (with control in HTTP headers if you like) or by writing the segments to disk and binary-streaming them into one file. I’d build in checksums, though, to make sure what you get is what you sent.
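
A minimal sketch of the reassembly half, assuming the segments have already arrived as numbered files in the same folder (segmentCount and the file names are hypothetical):

[code]' Stitch numbered segment files back into one file with BinaryStreams.
Dim segmentCount As Integer = 27 ' hypothetical number of segments
Dim out As BinaryStream = BinaryStream.Create(GetFolderItem("rebuilt.bin"), True)

For i As Integer = 1 To segmentCount
  Dim part As BinaryStream = BinaryStream.Open(GetFolderItem("segment." + Format(i, "000")), False)
  While Not part.EOF
    out.Write(part.Read(64 * 1024)) ' copy in 64 KB chunks to keep memory flat
  Wend
  part.Close
Next

out.Close
' Verifying a per-segment checksum (the classic MD5Digest class is one
' option) before this step catches corrupted transfers, as suggested above.[/code]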

But I am now wary of server sockets in RB/Xojo, at least until I can find out whether I was the problem (often the case, LOL).

I tried a few more things, including sending more data only in the SendComplete event, but it still locks up at 500MB or so. A bit flaky, I think.

EVERY one of the file download examples, for both Web Apps and Desktop apps, crashes on large file downloads. They work great for most image files, since those are normally small.

I just found that via the Mac OS Terminal I can call:

zip -r -s 100m ArchiveName.zip TheFileToZip.foo

This will zip the file and segment it into 100 MB chunks.

Is there a way to do the same on Windows?

Many ways via a shell command.

Some examples:

  1. RAR command line https://www.feralhosting.com/faq/view?question=36
  2. 7zip (my favourite, as it can unpack many formats; see the volume example below). http://sevenzip.sourceforge.jp/chm/cmdline/switches/volume.htm
  3. One version of zip. http://www.info-zip.org/mans/zip.html
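
For instance, 7-Zip’s -v switch produces fixed-size volumes, the Windows equivalent of the zip -s command above (file names are illustrative):

7z a -v100m ArchiveName.7z TheFileToZip.foo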

You might look at zlib http://www.zlib.net/ but check 64-bit support for large files.

I always post and then think of more things. Senility strikes!

The way I did it with 7zip was to write out a text batch file containing the 7zip command line, then use the LaunchAndWait function in the WFS (Windows Functionality Suite) to run the batch file and hang on until the command was done, then process the result. That was for unpacking, though. Cleverer types on the forum might be able to tell you a better way. :wink:

And again: looking through old programs, you can also use a Shell in mode 0.

Just make sure any .exe files and .dll files are present.
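
A minimal sketch of the mode-0 approach, assuming 7za.exe (the standalone 7-Zip console binary) is on the PATH and the paths are hypothetical:

[code]' Shell mode 0 is synchronous: Execute blocks until the command
' returns, so no batch file or LaunchAndWait is needed.
Dim sh As New Shell
sh.Mode = 0
sh.Execute("7za.exe a -v100m C:\temp\ArchiveName.7z C:\temp\TheFileToZip.foo")
If sh.ErrorCode <> 0 Then
  MsgBox("7-Zip failed: " + sh.Result)
End If[/code]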

Thanks Peter!

I found this one too: https://dotnetzip.codeplex.com/releases/view/68268

I think I want to stick with zip since it’s built into Mac OS. I wish I didn’t have to go this way. :frowning:

Neat… When I run the Mac OS X terminal command, I can see it creating the individual zip files with “.z##” extensions one by one… like .z02, .z03, etc.

So I could estimate the number of segments, show the user a progress bar as it’s prepping, and probably start sending earlier segments while the later segments are still being written!
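
A rough sketch of that estimate, assuming the 100 MB segment size from the zip command above and f pointing at the source file:

[code]' Estimate how many 100 MB segments zip will produce, to size the progress bar.
Dim segmentSize As Int64 = 100 * 1024 * 1024
Dim segmentCount As Integer = (f.Length + segmentSize - 1) \ segmentSize ' ceiling division
ProgressBar1.Maximum = segmentCount[/code]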

I was a little surprised at this thread, so I tested it myself. It really does consume as much memory as the file itself, even when setting the flag not to load the file into memory.

In a server I wrote, I did the following to handle larger files.

[code]getFile = f.OpenAsBinaryFile
s = getFile.Read(64 * 1024) ' prime the socket with the first 64 KB
Me.Write(responseHeaders.Construct + s) ' prepend the HTTP response headers to the first chunk[/code]

then, in the SendProgress event of the socket:

[code]If bytesLeft < 64 * 1024 And Not getFile.EOF Then
  ' bytesLeft is the SendProgress parameter: top up the buffer when low
  Me.Write(getFile.Read(128 * 1024))
End If[/code]

I played around with different values for the sizes and settled on those as they seemed to work fairly well for my usage.

I don’t know how the HTTP server is structured in Xojo, so I have no idea whether this would be a simple thing to implement, but I think it would be nice to have, as it would limit the amount of memory needed for serving files. It might also be good to offer the option of changing the chunk size for tuning purposes. If you’re on a local network you might want 1MB or 5MB chunks to fully utilize the available bandwidth.

Well, if anyone can produce an example for Windows that does not lock up I would be very keen to see it.

@Kevin Windham I did try various methods of reading in only pieces of the file, but the server still locked up after about 500MB.

@Hal Gumbert That DotNetZip looks rather nice.