What kind of back-up strategy do you have?

I’m working with a fairly small project.
Sometimes I come to an “end”, which may not be the end of the day or the week.
I mean, when the Big Problem is finally solved or tuned!

Then I have a complete and working version.
I’m sure many people work like this.

How do you continue?
Do you quit Xojo and make a folder or backup to hold the complete and running version of your project…?
I mean, I just wonder what kind of strategy other people use!?

I don’t have a clear strategy. I back up my entire computer from time to time…

I use Git on my local machine as a source control system. A source control system goes a VERY LONG way in protecting your hard work, and you can also easily “roll back” to previous versions of your program if you want to undo changes. Most source control systems can be used while Xojo is open.

I then regularly back up the Git repository (every second day or so) to a removable SSD drive. My projects are small enough to back everything up to a 1TB drive.

Having your backups stored offsite is also first prize of course.

I would highly recommend that you look into a source control system such as Git. Another option is to look into cloud services such as GitHub. With GitHub you can regularly push your code to the cloud, so when your machine crashes you don’t have to lose a millisecond of sleep.
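For anyone who hasn’t used Git before, the day-to-day workflow is small. This is a minimal sketch, not the poster’s exact setup; it uses a scratch folder and a made-up file name so it runs anywhere:

```shell
#!/bin/sh
# Minimal local Git workflow, in a scratch folder so the sketch runs anywhere.
PROJECT=$(mktemp -d)
cd "$PROJECT"

git init -q
git config user.name "Example"             # needed once on a fresh machine
git config user.email "example@example.com"

echo "version 1 of the window code" > App.xojo_code
git add .
git commit -q -m "Working version: Big Problem solved"

echo "version 2 of the window code" > App.xojo_code
git add .
git commit -q -m "Tuned the import routine"

# "Roll back": restore the file as it was in the previous commit
git checkout HEAD~1 -- App.xojo_code
cat App.xojo_code    # prints "version 1 of the window code"
```

The key habit is committing each time you reach a working state; rolling back is then one command away.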


Looked at Git and just can’t work out how to get it going.
Wish there was something for the Mac as simple as SourceSafe was on the PC: install, start a project, drag the files in… move on.

So I’m living without source control at the moment.
I use Time Machine backups (a godsend), and I periodically copy the whole folder to a named (versioned) backup folder on an SD card I keep in the side of the laptop.
And I use change-control comments in the source to let me know which version got which changes.

I keep my projects in Dropbox so I can open them on whatever machine I’m working on; that also keeps them backed up on all my devices. I use File History on my main development machine (sort of like Time Machine on the Mac) to back up my Dropbox folder. Then, for any major work, I use Git to commit my changes to Bitbucket (usually once a day).

I don’t like to lose work and short of all out nuclear Armageddon, I believe I’m covered with this strategy.

P.S. I’ve heard that some have problems using Dropbox as a development folder so ymmv if you use this strategy.

I usually do nothing until it’s too late - and then cry like a baby after I lose 10 years’ worth of stuff!
Yes, that recently happened to me!

My backup drive failed, then before I could replace it - my main drive also failed :frowning:

I can recommend Arq (http://www.haystacksoftware.com/arq/), which is a nice utility for online backups with Amazon and SFTP as server options.
This way you eliminate the company in between, and you have your own server or Amazon S3 account.

And of course, each Mac should have at least one Time Capsule.

For the project files, I have a backup folder and I make a copy of the project frequently, usually before I restart working on it.
In the backup folder, I add the SQL date of the last modification as a suffix.
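That dated-copy approach is easy to script. A minimal sketch, with placeholder folder names and a scratch directory so it runs anywhere:

```shell
#!/bin/sh
# Copy a project folder into a backup folder, suffixed with the
# SQL-style (YYYY-MM-DD) date. All names here are placeholders.
WORK=$(mktemp -d)
cd "$WORK"
mkdir MyProject backups
echo "project source" > MyProject/App.xojo_code

STAMP=$(date +%F)                       # e.g. 2024-05-01
cp -R MyProject "backups/MyProject-$STAMP"

ls backups                              # one dated copy per run
```

Run it each time you finish a working version and the backup folder accumulates one dated snapshot per day.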

Good question (IMHO):
What do you do with comments when a major step has been done (and after making a backup copy of the project)?

Do you remove every comment that pertains to testing once you have good running code?
Do you leave all the comments where they are?
Do you add comments where needed
(where you have not added them yet)?


Good comments usually explain what you were thinking at the time you wrote the code, making it easy to get back into the same frame of mind hours, days, or years later. If I change my mind or refactor something, I usually modify any nearby comments to reflect my current train of thought.

If I’m working on a project with multiple programmers then, in addition to inline comments, we usually have a section at the top of the method that explains any changes made, along with the date and the programmer’s initials.

I have a cheapo locally branded external HDD (I live in Taiwan) which gets used for Time Machine. Avoid any HDD that says made in Korea; I’ve had several, and most died within a month of the warranty expiring.

I also have a 64GB USB stick which I use with Backup To Go (which is freely available on the Mac App Store). I use this several times a day, especially at the end of the day.

Once a version is complete, I then use Arbed to bring in all the external resources and make a zipped copy.

Take the time to learn Git; it will be worth it. Learn to commit in small, functional changes, not one huge commit at the end of the week. Push to an offsite source code repository such as GitHub or Bitbucket. Then also have multiple local backups; for example, I have a locally attached hard drive and a hard drive attached over the network that my system backs up to hourly via Time Machine.
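Pushing to an off-site repository is two commands once the remote exists. A sketch of the idea, where a local bare repository stands in for GitHub/Bitbucket (the branch and file names are made up) so the example runs anywhere:

```shell
#!/bin/sh
# Sketch: small commits pushed to an off-site repository.
# A local bare repo stands in for GitHub/Bitbucket here.
WORK=$(mktemp -d)
git init -q --bare "$WORK/offsite.git"

git init -q "$WORK/project"
cd "$WORK/project"
git config user.name "Example"
git config user.email "example@example.com"

echo "one small, functional change" > App.xojo_code
git add .
git commit -q -m "Fix window resize logic"

# Point the project at the off-site repository and push
git remote add origin "$WORK/offsite.git"
git checkout -q -b daily-work
git push -q origin daily-work
```

With a real hosting service, the `git remote add origin …` URL would be the HTTPS or SSH address the service gives you; everything else is identical.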

Two more steps… I feel a network-attached HD is pretty important. I now know two people who did a great job backing up, only to come home after being out and find that their computer AND FW/USB-attached hard drive had been stolen. Tough luck for them: their computer and backup both gone.

Finally, take the really important things off-site. I know this sounds like something you do at a business, but not so. Most of us have thousands upon thousands of pictures on our computers: our photo albums. What happens in a fire? All my source code is in remote repositories somewhere, and most of my important spreadsheets, word docs, etc. I could live w/o. Pictures and movies of my kids growing up? Trade with a trusted brother, sister, mom, dad, or friend. Each of you buys an external hard drive. They back up to theirs, you back up to yours, and then you swap. Each month, you erase the HD, back up to whichever drive you have, and swap again. You’re teaching them something valuable and you’re both providing a valuable service to each other.

  1. Cornerstone as SVN client, with external items for Xojo.
  2. Internal secondary hard disk for nightly mirroring. Saved my bacon last week again because the primary hard disk went faulty. Time Machine lacks control.
  3. CrashPlan. Yes, I know about the NSA, but it’s only $50 a year for unlimited storage.
  4. External hard disks which get the most important data and are swapped more or less regularly to a bank vault.

Yes, I’m paranoid about my data.

Thank you!
It was very interesting to see and to read all kinds of solutions and strategies you all have out there!

I’ve never paid so much attention to this before, because when developing for the web, all code is uploaded to the web server and things are stored in two places by default…

But when working with Xojo, it’s completely different!
I’ll think of something… I don’t know what. Maybe the USB stick to begin with!? But they are both slow and terribly small… they get lost instantly!

But true, I need to shape up my life in this sense! :slight_smile:

I now have 2 USB sticks. Every time I make a major change, I copy it to both USB Sticks.

One attaches to my keyring (so I always have my software with me).
The other gets put in my wardrobe (as a backup of my backup).


On my home network I have a few rackmount servers; two of them are NAS. One runs FreeNAS and another runs Solaris 10. I rsync the FreeNAS box to my Solaris 10 server every month or two, unless a lot of data has been added sooner. I back up my laptop the same way: when I think I have a lot of stuff that needs to be synced, the laptop gets synced to the FreeNAS box, and the cycle continues.

As someone who does Backup & Recovery for a living: there are two stages of “backup”.

First, use something like Git (I prefer Git myself). That way you can take snapshots of your work whenever you reach a goal or complete something within your project. You can take as many snapshots as you wish. Git also allows you to push the changes to a remote site so you have an offsite copy of the data.

Git is good for the source code. For the desktop/laptop (all the apps, docs, etc., everything but the source code) I would use backup software. CrashPlan works extremely well and is fairly cheap. It is FREE if you back up from your computer to another computer (a friend’s or your own).

For my computers I use the Git/CrashPlan combo and have never lost any data. If you have questions about either, please let me know.

We run a Subversion (SVN) server on a NAS at the office. Every fourth hour it copies the repositories to an off-site NAS.
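A schedule like that is typically driven by cron. A hypothetical crontab entry using `svnadmin hotcopy` might look like the following (all paths are made up, and `--incremental` needs Subversion 1.8 or later):

```shell
# Every fourth hour, make an incremental hot copy of the repository
# onto the mount point where the off-site NAS is attached.
# (Placeholder paths; adjust to your own repository and mount point.)
0 */4 * * * svnadmin hotcopy --incremental /srv/svn/projects /mnt/offsite-nas/svn/projects
```

`hotcopy` is safe to run while the repository is in use, which is what makes it suitable for an unattended schedule.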

  1. Source control of some kind. I use both Subversion and Git (BitBucket is free and easy to set up) with remote repositories. Commit functional changes at least daily.
  2. Local backup. I use Time Machine to an internal drive in my Mac Pro. Runs once per hour.
  3. Remote backup. I use CrashPlan. Runs every 15 minutes.
  4. Drive clones. I use Carbon Copy Cloner.
  5. Microsoft OneDrive, Box and Dropbox for files I want to easily access anywhere.

1.) No Source Control
2.) Backup manually

a.) ChronoSync every time I make some progress (with Archive Management to store old revisions) to a DROBO 5D with Dual Disk Redundancy
b.) CarbonCopyCloner Drive Clone every week or so to a DROBO 5D
c.) CarbonCopyCloner Drive Clone every month or so to a bootable USB drive