
#4 Differential backup for large files

Status: open
Owner: nobody
Labels: None
Priority: 5
Updated: 2014-04-03
Created: 2014-04-03
Creator: Zaid Amir
Private: No

Sorry to bother you with this, but it seems I have reached a dead end with my research and would appreciate some help.

I am responsible for the design and development of a cloud backup application. Currently we use xDelta to do differential backups, so that only the changed parts of each file get uploaded. This cuts costs for both us, the company, and our users. One issue with this approach is that producing a differential backup requires comparing the modified file against the original. To solve this we opted for a quick solution: cache the original files on the user's computer for quick reference.
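For context, the per-file encode step is roughly equivalent to the sketch below. It is not our actual client code; the `make_delta` helper and the paths are placeholders, and it shells out to the xdelta3 command-line encoder purely for illustration.

```python
import subprocess
from pathlib import Path

def make_delta(cached_original: Path, modified: Path, delta_out: Path) -> None:
    """Encode only the differences between the cached original and the modified file.

    Placeholder sketch: assumes the xdelta3 command-line tool is installed.
    """
    # xdelta3 -e (encode) -s <source/original> <modified> <delta output>
    subprocess.run(
        ["xdelta3", "-e", "-s", str(cached_original), str(modified), str(delta_out)],
        check=True,
    )

# The resulting delta file is what gets uploaded instead of the whole file,
# e.g. (illustrative paths only):
# make_delta(Path("cache/data.db.orig"), Path("data.db"), Path("upload/data.db.vcdiff"))
```

The key point is that the encoder needs both the cached original and the current version of the file locally, which is exactly where the problem below comes from.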

Now this solution works brilliantly for desktop/home users, where file sizes are relatively small. The problem arises when, say, a server administrator tries to back up an SQL database that is 100 GB in size. Because we currently cache the file on disk, server administrators are understandably unhappy: we are filling their drives with cached copies and simply duplicating their used space.

I am wondering whether there is another approach to differential backups that does not require keeping the original files on disk permanently.

Discussion

