Large File Downloads via Web possible?

Why not put together an API wherein the client asks for bytes X through Y of the file? That way, it can download chunks of any arbitrary size. Getting that portion of the file should be easy with a BinaryStream.
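The thread is about Xojo/RB, but the byte-range idea is language-neutral. Here's a minimal sketch in Python (the function name `read_range` is hypothetical) showing how the server side can return bytes X through Y without ever holding the whole file in memory:

```python
def read_range(path, start, end):
    """Return bytes start..end (inclusive) of the file at path.

    Only the requested range is read into memory, which is exactly
    what a BinaryStream's seek + read gives you in Xojo.
    """
    with open(path, "rb") as f:
        f.seek(start)                  # jump straight to the first byte
        return f.read(end - start + 1) # read only the requested span
```

The client just loops, asking for successive ranges until it has the whole file.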

Tim, would that not load the entire file into memory on either the sending or receiving side?

I’d be more than willing to pay for some working code that would send huge folderitem files from one computer to another.

No. The sender would just read chunks of a manageable size and the receiver would write the chunks one by one.

Out of interest, I use an oldish PC download manager called ReGet. This program first gets the file size from the HTTP headers, then writes a blank file of that size. Then it opens multiple connections (if the server permits that), dividing the download across however many connections you specify (I use 8 or 12, depending on whether I feel the family needs a bit of bandwidth :wink: ) and writes the pieces into the correct place in the pre-made file.
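That preallocate-then-fill trick is easy to sketch. This is not ReGet's actual code, just an illustration in Python of the two operations it performs: create the file at its final size up front, then let each connection write its piece at the right offset:

```python
import os

def preallocate(path, size):
    # Create a blank file of the final size, like ReGet does
    # before any download connection starts.
    with open(path, "wb") as f:
        f.truncate(size)

def write_piece(path, offset, data):
    # Each connection writes its downloaded piece at the
    # correct position in the pre-made file.
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)
```

Because every piece lands at a fixed offset, the connections can finish in any order and the file is still assembled correctly.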

Torrents work that way too, after a fashion, using chunks.

Are these client programs getting all your big files from one central place, or how do you work?

I have an HTTP server that can handle large file uploads and downloads. The only problem it has that I know of is it will crash with SSL on when used with Chrome. I’ve set it aside hoping that the bug in the server socket would get fixed. I guess I need to test it out again to see if that’s still an issue. If not, maybe I can dust it off and simplify it for this specific use.

From rudimentary tests I conducted a while ago, you can perfectly well use HandleURL to open a BinaryStream and Request.Print chunks of data. I have been able to present entire HTML sites that way.

I wonder whether that would be the solution for really big files: instead of loading the whole file into memory, BinaryStream could just read it with Read(numberOfBytes). Then you Request.Print the chunk you just got. You will probably need to slow down the Read/Print loop so as not to overflow the client's buffer, especially given the size of what you send.

Unless BinaryStream loads everything in memory, that could be the way to serve the file.
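The Read-then-Print loop described above is the core of the whole approach. A language-neutral sketch in Python (the `send` callback stands in for Request.Print or a socket write; the chunk size is an assumption, not anything from the thread):

```python
CHUNK = 64 * 1024  # a manageable chunk size; tune to taste

def stream_file(path, send):
    """Read a file in fixed-size chunks and hand each one to send().

    At no point is more than one chunk held in memory, so the file
    can be arbitrarily large.
    """
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:      # end of file reached
                break
            send(chunk)
```

In a real server you would pace the loop (or respect back-pressure from the socket) so you don't overrun the client's buffer, as noted above.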

The app will let the person share their files with other folks without uploading them to a server. Essentially, it’ll be a specialized file server.

[quote=182586:@Hal Gumbert]Tim, would that not load the entire file into memory on either the sending or receiving side?
[/quote]
As others have said, it will only load the requested chunk of data, not the whole file. That’s what BinaryStream is designed for: random access to a small piece of a large file.

I wish I had time to put something together.

I’m going to try and do it now. Wish me luck!

I went back and looked at the code I wrote before I ran into the problem… Someone sent me a demo file a while back that I must have failed to look into… I must be losing my mind.

It seems to be working. It reads the files in portions, like you guys mentioned, then converts that data into base64 and adds a header. The other side does the reverse…
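The exact header format isn't given in the thread, so here is only a plausible sketch in Python of that base64-plus-header framing (the JSON header and field names are my invention, not the app's actual wire format):

```python
import base64
import json

def frame(chunk, seq):
    # Hypothetical wire format: a JSON header line describing the
    # chunk, then the base64-encoded payload on the next line.
    header = json.dumps({"seq": seq, "len": len(chunk)})
    payload = base64.b64encode(chunk).decode("ascii")
    return header + "\n" + payload + "\n"

def unframe(text):
    # The receiving side does the reverse: parse the header,
    # decode the payload, and sanity-check the length.
    header_line, payload_line = text.split("\n")[:2]
    header = json.loads(header_line)
    data = base64.b64decode(payload_line)
    assert len(data) == header["len"]
    return header["seq"], data
```

One caveat worth knowing: base64 inflates the data by about 33%, so sending raw bytes with a binary length prefix would be noticeably cheaper for huge files.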

Thank you for all the help!

Strange one. I managed to get an old Mac running today and copied off a simple file send/receive program I last looked at in 2002, when sockets were sockets and that was all you had. Just by "resolving supers", changing 10 lines of code, and using Int64s instead of Integers, I managed to send myself a 3 GB file with no problem at all using RB. The program has only one socket for receiving, set to listen, and one for sending, and you can use it to send to yourself (as opposed to others, which is possible too). Under Win7 32-bit.

So now I don’t know…