I work on a team of three to five developers who work remotely on an ASP.NET web application. During development we each run a full local copy of the database, restored from a recent backup. The current backup, compressed, is about 18 GB.
I’m looking for an easier way to keep all of our local copies relatively fresh without each of us individually downloading the 18 GB file over HTTP from our web server on a regular basis. FTP is an option, but it wouldn’t speed the process up at all.
I’m familiar with torrents, and the thought keeps occurring to me that something like that would work well here, but I’m unsure about the security implications and the setup.
Does your database support incremental backups or log shipping? If so, you might try one of those, and just update the database instead of doing a complete restore. This won’t give much advantage if your DB structure changes a lot, but if it’s mostly data changes then this could save you a good deal of time (and network bandwidth).
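For example, with SQL Server (a likely backend for an ASP.NET app, though the question doesn't say), a differential backup contains only the pages changed since the last full backup, so it's usually far smaller than the 18 GB full one. A minimal sketch; the database name and file paths are placeholders:

    -- On the server: take a differential backup.
    -- (WITH COMPRESSION requires a newer SQL Server edition; drop it otherwise.)
    BACKUP DATABASE AppDb
        TO DISK = N'D:\Backups\AppDb_diff.bak'
        WITH DIFFERENTIAL, COMPRESSION;

    -- On each developer machine: restore the full backup once,
    -- leaving the database able to accept further restores...
    RESTORE DATABASE AppDb
        FROM DISK = N'C:\Backups\AppDb_full.bak'
        WITH NORECOVERY, REPLACE;

    -- ...then apply the much smaller differential to bring it current.
    RESTORE DATABASE AppDb
        FROM DISK = N'C:\Backups\AppDb_diff.bak'
        WITH RECOVERY;

Each developer then downloads the full backup only occasionally and pulls just the small differential day to day.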
rsync, or some other copy-only-the-differences system. Two consecutive dumps of the same database ought to be largely similar, so the delta that actually needs transferring should be small.
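A sketch of the rsync approach, assuming the backup is reachable over SSH; the host and paths are made up:

    # Pull the latest backup, transferring only changed blocks.
    # -a preserves attributes, -z compresses data in transit,
    # --partial lets interrupted transfers resume.
    rsync -az --partial --progress \
        user@buildserver:/backups/AppDb_full.bak \
        ~/backups/AppDb_full.bak

One caveat: delta transfer only helps if successive files are byte-similar, and ordinary compression destroys that similarity. Sync the uncompressed backup and let rsync’s -z handle wire compression, or use an rsync-friendly compressor (gzip --rsyncable, where available).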
Do they need all the data?
If not, they can create the database empty (schema only) and then load just the data they need to be able to work. That way they save both bandwidth and time.
Alternatively, when you take the backup, strip out the parts nobody needs for development (large log or audit tables, for example). That makes the backup much lighter and faster to download.
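A rough sketch of the trimmed-copy idea in T-SQL, again assuming SQL Server; the database name, table names, and three-month cutoff are all hypothetical:

    -- Build a lightweight dev copy of the production database.
    CREATE DATABASE AppDb_Dev;
    GO

    -- Copy only the rows developers actually need, e.g. recent orders.
    -- Note: SELECT ... INTO copies columns and data, but not indexes,
    -- constraints, or triggers; script those separately if needed.
    SELECT *
    INTO   AppDb_Dev.dbo.Orders
    FROM   AppDb.dbo.Orders
    WHERE  OrderDate >= DATEADD(MONTH, -3, GETDATE());

    -- Lookup tables are usually small; copy them whole.
    SELECT *
    INTO   AppDb_Dev.dbo.Products
    FROM   AppDb.dbo.Products;

    -- This trimmed backup is what developers download.
    BACKUP DATABASE AppDb_Dev
        TO DISK = N'D:\Backups\AppDb_Dev.bak';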