Like many others, I’ve been having trouble with an unreliable WebFileUploader. Although small uploads usually work, larger uploads routinely fail, and the issue seems to be related to network speed and/or latency; e.g. the exact same file upload will succeed on fast WiFi but fail over a slow cellular connection.
In the /tmp folder, the uploaded file is present, but is about 90% complete.
I’m pretty sure this is a bug in Xojo’s socket or multipart-MIME handling.
In any case, I realized that the file is mostly there, it’s just the end bytes that are missing.
So here’s the hack:
Override the XojoUploaderEngine using JavaScript inside Session.Open. The altered JavaScript simply adds a second copy of the file being uploaded to the upload queue.
On the server side, whenever UploadProgress hits 100%, start a timer.
When the timer fires, if UploadFinished hasn’t fired, then we go hunting inside the /var/folders temporary items to find the partial upload. Repeat as necessary.
Because we included the same file twice in the upload queue, only the second copy of the file is truncated; the first copy is fine. With some multipart-MIME hacking, we extract that first copy and save it to disk.
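For anyone curious, the server side ends up looking roughly like this. The event names follow what I described above; the temp-folder scan, the boundary string, and the helper/property names (mUploadDone, mExpectedName, etc.) are placeholders for my setup rather than anything from the framework, so treat it as a sketch:

```
' Properties on the page (hypothetical names):
'   mUploadDone As Boolean       ' set once UploadFinished fires
'   mExpectedName As String      ' name of the file the user picked

' WebFileUploader UploadProgress event (signature simplified)
Sub UploadProgress(percent As Integer)
  If percent >= 100 Then
    ' Give UploadFinished a few seconds to arrive before assuming it is stuck.
    Timer.CallLater(5000, AddressOf CheckForStalledUpload)
  End If
End Sub

' WebFileUploader UploadFinished event
Sub UploadFinished()
  mUploadDone = True
End Sub

Sub CheckForStalledUpload()
  If mUploadDone Then Return ' normal case: nothing to recover

  ' Hunt through the temporary items for the partial multipart body.
  ' Exact location and file naming vary by platform and framework build.
  Var tmp As FolderItem = SpecialFolder.Temporary
  For i As Integer = 0 To tmp.Count - 1
    Var f As FolderItem = tmp.ChildAt(i)
    If f <> Nil And f.Name.Contains(mExpectedName) Then
      RecoverFirstCopy(f)
      Return
    End If
  Next

  ' Not there yet - repeat as necessary.
  Timer.CallLater(2000, AddressOf CheckForStalledUpload)
End Sub

Sub RecoverFirstCopy(partialFile As FolderItem)
  ' The body contains the same file twice; only the trailing copy is truncated,
  ' so split on the multipart boundary and keep the first complete part.
  Var bs As BinaryStream = BinaryStream.Open(partialFile, False)
  Var body As String = bs.Read(bs.Length)
  bs.Close

  Var boundary As String = "----XojoFormBoundary" ' placeholder: use the real boundary from the request
  Var parts() As String = body.Split(boundary)
  If parts.Count < 2 Then Return

  ' Strip the part headers (everything up to the first blank line) from the first file part.
  Var firstPart As String = parts(1)
  Var headerEnd As Integer = firstPart.IndexOf(EndOfLine.Windows + EndOfLine.Windows)
  If headerEnd < 0 Then Return
  Var fileData As String = firstPart.Middle(headerEnd + 4)

  Var dest As FolderItem = SpecialFolder.Documents.Child(mExpectedName)
  Var outStream As BinaryStream = BinaryStream.Create(dest, True)
  outStream.Write(fileData)
  outStream.Close
End Sub
```

The 5-second and 2-second delays are arbitrary; anything comfortably longer than your worst-case finish time should do.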
This is a super ugly hack, but I’ve been testing it in production for 48 hours, and it has reduced my number of failed uploads to zero.
Update: I’ve been running this fix for about 48 hours, and so far it’s performing perfectly. Of approximately 200 uploads, about 65% initially failed. However, with the fix in place, in every case where the upload failed, the file was recovered successfully. End result: a 100% success rate on uploads.
The uploads are Word and PDF files ranging from a few KB up to about 5 MB, with a mix of users on Macs, Windows, Android, and iOS devices on their home networks.
Oh, I didn’t see the “any longer”; the “and longer…” threw me off. But I don’t understand how @Mike_D can reproduce it and you can’t, using the provided samples and the “how to”.
ahh, typo … Well, back in October I could reproduce it (too often), but not always. So far I have not been able to reproduce it again … I’m very interested in this topic, as for years I had issues with uploading files to a web server. It now seems to work, but this is really a feature that needs to be rock solid.
The biggest was 21 MB. Could you work with some kind of MD5Digest to check whether the file uploaded successfully? Are you zipping the files beforehand? My subjective observation is that zipped files seem to upload more consistently.
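Something along these lines would do the check, assuming a reference hash can be computed for the original file before upload and sent along with it (FileMD5, savedFile and expectedMD5 are made-up names):

```
' Returns the MD5 of a file as a lowercase hex string so two copies can be compared.
Function FileMD5(f As FolderItem) As String
  Var bs As BinaryStream = BinaryStream.Open(f, False)
  Var data As String = bs.Read(bs.Length)
  bs.Close
  Return EncodeHex(Crypto.MD5(data)).Lowercase
End Function

' Usage: expectedMD5 is the hash of the original file, computed before upload.
If FileMD5(savedFile) <> expectedMD5 Then
  ' Truncated or corrupt upload - ask the user to retry.
End If
```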
It’s an unfortunate issue and a waste of time, and I again have the “feeling” that it differs from browser to browser. Safari seems to have more problems than the others. I know this all sounds weird, but I can’t really pin it down to one particular file or configuration. All I can say is that my recent tests were, luckily, successful. Even an Edge transfer of approx. 1 MB succeeded, which in the past always(!) failed.
Besides the reproducible steps in the case being Safari-based, if you read the case you will see that other people are seeing the completion event never fire after reaching 100% in Chrome too (Hamish and Adam), and Hamish added that a timeout event was not firing either.
Yes, I am probably just lucky, or silly, or who knows what. I can’t emphasise enough how much I want this bug to be solved. Almost all my webApps need to upload some files, and this has never worked properly. I will probably only know when I roll out an update to one of my webApps in a few weeks :-(.
I just noticed the following: uploads are working fine again in Chrome, but uploadComplete is not firing when uploading to my web server via Safari (my file is 54 KB); it works fine, though, when run locally in Safari. Safari Version 14.0.1 (16610.2.11.51.8) on Big Sur 11.0.1.