Blog Post - Amazon S3 Class for Xojo

It’s easy to use Amazon S3 in Xojo Apps with this S3 Class. Big thank you to @Tim Dietrich and @Christian Schmitz for the sample code!

http://bit.ly/XojoS3

Thanks.
Maybe a good chance to make a verbose version for the Magazine.

Thank you for this Hal. I’ve been trying to get to a project where I write a backup system and this class sounds like it accomplishes the offsite backup tasks simply.

Thanks! Tim and Christian did all the hard work. I just put a bow on it. :slight_smile:

I have an AWS S3 class that does not require any Plugins… it is 100% Classic Framework… and I will be providing it to attendees as part of my session at XDC on AI/ML.

The class will:

  1. List all your buckets in a specified region.
  2. List the files in a given bucket.
  3. Download a given file from a bucket.
  4. Upload a file to a given bucket.

… so there are a lot of other things it does not do… like create buckets or set policies…
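Under the hood, listing the files in a bucket means calling S3’s REST API and parsing the XML it returns. As a rough illustration of the response shape (in Python rather than Xojo, against a hand-written sample modeled on the documented ListObjectsV2 response), the parsing side of operation 2 might look like:

```python
import xml.etree.ElementTree as ET

# A trimmed, hand-written sample of the XML that S3's list call returns.
# Element names and the namespace URI follow the S3 API documentation.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>my-example-bucket</Name>
  <Contents>
    <Key>Contacts/Hal Photo.jpg</Key>
    <Size>48211</Size>
  </Contents>
  <Contents>
    <Key>Contacts/Notes.txt</Key>
    <Size>1024</Size>
  </Contents>
</ListBucketResult>"""

ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
root = ET.fromstring(sample)

# Collect a (key, size) pair for every object in the listing.
objects = [(c.find("s3:Key", ns).text, int(c.find("s3:Size", ns).text))
           for c in root.findall("s3:Contents", ns)]
print(objects)
```

The bucket name and keys here are made up; a real listing is paginated, so a full implementation would also follow the continuation token in the response.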

Hi all.
I’m unsure what S3 offers.

The upload appears to be inexpensive.
As the seller of software which creates content, I sometimes hear from people who have managed to trash their machines, delete files accidentally, or simply ‘lose’ files.

It seems that, using S3, I might offer a service where what they save is also uploaded to S3 (encrypted or ‘linked’ to their registrations), so that in the case of disaster, they (or I) could recover the lost files later.
Does this sound viable, or does an S3 account tie itself to one IP address per account?

Check out Cloud Object Storage - Amazon S3 - AWS

It is very powerful with features like version control, retention rules, etc…

You (and potentially your customers) will need an account… which is free until you reach certain levels of monthly usage.

[quote=425635:@Jeff Tullin]Hi all.
I’m unsure what S3 offers.

The upload appears to be inexpensive.
As the seller of software which creates content, I sometimes hear from people who have managed to trash their machines , delete files accidentally, or simply ‘lose’ files.

It seems that, using S3, I might offer a service where what they save is also uploaded to S3 (encrypted or ‘linked’ to their registrations), so that in the case of disaster, they (or I) could recover the lost files later.
Does this sound viable, or does an S3 account tie itself to one IP address per account?[/quote]

As the resident storage expert, I will speak up.

Amazon S3 (and its clones/competitors) uses a specialized REST API to store and retrieve data (aka files). There is no filesystem, so there is no folder hierarchy. When you create a new object (think file), you store it under a key: a string that serves as the ID for that object within its bucket. To get that file back, you request the object by its bucket and key.

Are your files/objects tied to a single IP? Nope. S3 doesn’t care what IP you come from; it handles security with authentication/authorization tokens. The “Bucket” (think of a large folder, or a large collection of objects) is how you organize your files. An object can be in only a single bucket (technically you can write it to multiple buckets, but each copy is a separate object, as one bucket doesn’t know about another). By default you start with a single bucket, but you can create more.
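To make the bucket + key model concrete: every object is addressed by its bucket plus its key, and (for virtual-hosted-style requests) that pair maps to a predictable URL. A minimal Python sketch, with a made-up bucket, key, and region:

```python
def s3_object_url(bucket: str, key: str, region: str = "us-east-1") -> str:
    """Build the virtual-hosted-style URL for an S3 object.

    Note: the key is used as-is here; in a real request it must be
    percent-encoded first (slashes excepted).
    """
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

# Same key in a different bucket addresses a different object entirely.
url = s3_object_url("my-backups", "laptop/2019-02-14/photos.zip")
print(url)
```

Whether a URL like this is fetchable without credentials depends on the bucket’s access settings; by default objects are private and requests must be signed.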

Let’s say you have an application that needs to download images from time to time (like holiday-themed ones), or updates, or whatever. You can store all of these in a single bucket in S3 and have all your clients (around the globe) pull them out of S3. You just have to tell your app how to find the files in S3.

As @Kevin J Cully mentioned about his backup software, he can have all the various hosts dump files into a single bucket. I would recommend that each “customer” have their own bucket, and all their computers dump to that single bucket. So geeks like me would have a dozen or so computers dumping to one bucket.

If anyone has any more questions about S3, please let me know.
–sb

I just updated the S3 Class so that it now encodes the S3 Key using EncodeURLComponent, without encoding the Slashes.

For example, the key:

Contacts/Hal Gumbert/PhotoURL/Hal Photo.jpg

would be:

Contacts/Hal%20Gumbert/PhotoURL/Hal%20Photo.jpg

It’s done via the newly added KeyEncode Method.
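For comparison, the same “encode everything except the slashes” behavior exists in Python’s standard library: `urllib.parse.quote`, whose `safe` parameter defaults to `"/"`:

```python
from urllib.parse import quote

key = "Contacts/Hal Gumbert/PhotoURL/Hal Photo.jpg"

# quote() percent-encodes reserved characters (space becomes %20) but,
# by default, leaves "/" alone (safe="/"), matching KeyEncode above.
encoded = quote(key, safe="/")
print(encoded)  # Contacts/Hal%20Gumbert/PhotoURL/Hal%20Photo.jpg
```

Keeping the slashes unencoded matters because S3 treats them as part of the key’s “path”, and the signature is computed over the encoded form.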

[quote=425635:@Jeff Tullin]Hi all.
I’m unsure what S3 offers.

The upload appears to be inexpensive.
As the seller of software which creates content, I sometimes hear from people who have managed to trash their machines, delete files accidentally, or simply ‘lose’ files.

It seems that, using S3, I might offer a service where what they save is also uploaded to S3 (encrypted or ‘linked’ to their registrations), so that in the case of disaster, they (or I) could recover the lost files later.
Does this sound viable, or does an S3 account tie itself to one IP address per account?[/quote]
Yes, and if you know that backups are going to be accessed infrequently you can save money on storage costs.

For our system, the major benefits of S3 are that you don’t have to manage your storage anymore and don’t really have to worry (too much) about backups / data loss. Some of our customers have received uploads totalling over 450GB in a single day, which would be difficult to manage in a traditional data centre environment.

I updated the S3 Class FolderItemUpload method so it now assigns the Content-Type header. It extracts the extension of the filename from the Key and then uses the extension to get the MIME type. I also added a commented-out header to set the ACL to ‘public-read’. It might be a good idea to add the Content-Type and the ACL as optional parameters.

Added to S3.FolderItemUpload:

Dim Headers() As String
Headers.Append( "Content-Type: " + EncodeURLComponent( FileNameExtensionToMimeType( FileNameExtension( pKey ) ) ) )
'Headers.Append( "ACL: public-read" )
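For reference, the extension-to-MIME-type lookup that a helper like FileNameExtensionToMimeType performs is available in Python’s standard library as `mimetypes.guess_type`, which works from the filename alone:

```python
import mimetypes

# guess_type() returns (mime_type, encoding); the type is derived
# purely from the file extension, just like the lookup in the class.
types = {name: mimetypes.guess_type(name)[0]
         for name in ("Hal Photo.jpg", "report.pdf", "notes.txt")}

for name, mime in types.items():
    print(name, "->", mime)
```

For an unknown extension `guess_type` returns `None`, so a real upload path should fall back to a generic type such as `application/octet-stream`.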