I've been using backuppc and bacula together for a long time. The amount
of data to back up has been growing massively lately (mostly large video
files). At the moment I'm using backup to tape for the large RAID arrays.
Next year I may have to back up 300-400 TB. Backuppc is used for a small
amount of data, 3-4 TB.
We are looking at some high-end solutions for the primary storage right
now (NetApp etc.), but these will be very expensive. Most of the data
will be written once and then not touched for a long time, maybe never
read again at all. There is also no need for an HA solution.
So I will also look into cheaper solutions with more RAID boxes. I
don't see a major problem with this, except for backups.
Using snapshots with NetApp filers would be a very nice way to handle
backups of these large amounts of data (only the delta is stored). Tapes
are more complicated to handle than backup to disk.
Does anyone have experience with using backuppc at these massive data
volumes? I can't imagine a pool with x00 TB, or using dozens of
backuppc instances with smaller pools.
Any thoughts? This might be a bit off topic, but if someone has a clever
idea I would be interested to hear it!