Cloud storage with (S)FTP upload and HTTP(S) download

For an app, I’m looking for cloud storage that allows uploading files via secure FTP and then downloading them over HTTP(S) without manual actions or API communication.
I have the following requirements:

  • Affordable
  • Fast (especially upload speed, which should not be capped)
  • Flexible (I need about 500 GB to start and want to be able to expand to at least 1 TB)

So far I have tried:

  • Amazon, which I found to be too slow for the large files I use (10 MB to 200 MB per file)
  • Strato Virtual Server (which caps FTP upload speed)
  • Strato HiDrive (which does not allow automatic HTTPS downloads of FTP-uploaded files)
  • BrickFTP (which is fast, but they charge $100 a month for 500 GB)

Any suggestions would be welcome.
Thanks in advance for your reply.

Don’t you have a website where you could upload via SFTP and download via HTTPS?

@Christian Schmitz : Yes, but that has some downsides:
On my local server, bandwidth is limited and vulnerable to power and ISP outages.
On my externally hosted websites, FTP upload speeds are capped at 3 Mb per second, which is too slow.

Try ServerWarp.

@Sunil Abraham : They have limited bandwidth, and with 200 MB files that would not work well.

If you’re interested, I can set you up on my server…


I’ll even throw in the SSL certificate for free (Let’s Encrypt).

How about free?

No caps, but speed is 100 Mbps up and down

While the hosting itself is free, the extra space comes at a cost; if you’re willing to pay that, I can get you up to 4 TB all to yourself…

Bandwidth can always be expanded, but you should fully understand your use case.

Even if you had a low-end dedicated server with a 10 Mbps cap, you could still move about 3.1 TB of data a month, or 105.4 GB per day. If your average file is 200 MB, that’s roughly 525 files a day. It also means you would need at least 3.1 TB of storage to saturate a 10 Mbps cap.
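As a quick sanity check, the arithmetic above can be reproduced in a few lines of Python (using decimal units here; the post’s slightly lower figures come from 1024-based GiB, so the numbers differ by a few percent):

```python
SECONDS_PER_DAY = 86_400

def daily_transfer_gb(link_mbps: float) -> float:
    """GB/day a fully saturated link of `link_mbps` megabits/s can move."""
    bytes_per_day = link_mbps * 1e6 / 8 * SECONDS_PER_DAY
    return bytes_per_day / 1e9

gb_day = daily_transfer_gb(10)      # 108 GB/day at a 10 Mbps cap
tb_month = gb_day * 30 / 1000       # ~3.24 TB over a 30-day month
files_day = gb_day * 1000 / 200     # 200 MB average file size
print(f"{gb_day:.1f} GB/day, {tb_month:.2f} TB/month, {files_day:.0f} files/day")
# prints "108.0 GB/day, 3.24 TB/month, 540 files/day"
```

Either way, the conclusion holds: even a capped 10 Mbps link moves on the order of 3 TB a month if kept busy around the clock.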

At ServerWarp none of our servers have less than 1 Gbps network speeds, although providers and uplinks differ between locations and vary with network conditions. With a dedicated 1 Gbps server (10 Gbps is available), you are talking about 300 TB a month of transfer. Building a storage array that could handle that much transfer is more expensive than getting the network connection for it.

We have plans that go much higher (see our Linux page), but let’s take the $100 plan with 4 TB of transfer. That’s 20,000 uploads of your 200 MB file per month. However, that plan only includes 100 GB of fast SSD disk space, and you would need 4 TB to store it all.

So you see, bandwidth is actually not the issue you face; it’s storage. Of course, we can help there too.

@Phillip Zedalis : Thanks, I appreciate the analysis. The files would only stay there for a few days, so storage would not need to be that large, but 100 GB would likely end up being too small. I’ll send you an email with some additional questions.

Be careful what you sign up for: make sure you run the numbers for the type of service you’re providing, average concurrent users, etc. If you expect each user to upload/download once or twice per day and files to last only a few days, it’s a pretty easy calculation.

Theoretical throughput on a SATA3 connector is around 600 MB/s; whether you can exceed this depends on the service/hardware you opt for. You might be sold 10 Gbit/s, but you won’t be able to hit that unless your hardware can provide it or you cache to RAM. 1 Gbit/s is around 128 MB/s and is a lot easier to attain.

Don’t forget: 6000 GB/month may sound like a lot, but averaged over the month it’s only approximately 18.25 Mbit/s. Then again, your users might not be saturating your service, so it might not matter that much.
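That quota-to-average-rate conversion is easy to check yourself. In decimal units over a 30-day month it comes out slightly higher than the ~18.25 Mbit/s quoted above (which appears to assume a 31-day month and binary units):

```python
# Convert a monthly transfer quota into an average sustained bit rate.
quota_gb = 6000
seconds_per_month = 30 * 86_400          # 30-day month

avg_mbit_s = quota_gb * 8_000 / seconds_per_month   # 1 GB = 8000 Mbit
print(f"{avg_mbit_s:.2f} Mbit/s average")
# prints "18.52 Mbit/s average"
```

Either figure makes the same point: a large-sounding monthly quota corresponds to a surprisingly small sustained rate.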

I’m looking into the following at the moment:
  • (various unmetered levels depending on price; can add more storage for $1/month per 50 GB, so $20/TB)
  • (cheap and fast, but not much storage)
  • (have used for years without issue)

I’m still evaluating things and haven’t tested any of those for SFTP. Scaleway you can test for cents, as they price by the hour.

Here are a few tools to help you: (last box at the bottom)

Of course, managed vs unmanaged is another story :slight_smile:

Good points, Julian. You don’t typically use a SATA drive for a 10 Gbps server that needs heavy uploading. Generally you set up a dedicated storage server/array, i.e. a SAN or similar, that can handle the throughput. Other options are PCI Express–based SSD cards and the like. The problem is very solvable.