From: Daniel D. <dan...@gm...> - 2007-12-28 23:17:14
Bryan Penney wrote:
> The original document I quoted was for an older version, but I found
> one for 2.9.1 and it still says it doesn't understand hardlinks:
>
> http://www.seas.upenn.edu/~bcpierce/unison//download/releases/unison-2.9.1/unison-manual.pdf
>
> I've copied a much smaller pool (150GB) using rsync when we first went
> to a production server.
>
> Both of the servers have 2GB of RAM. After I get the drives for the
> new server, I will try rsync. It will be interesting to see how long
> it takes to copy all of this data with all of those hardlinks.
>
> Thanks for the help.
>
> Bryan
>
> On 12/28/2007 4:50 PM, dan wrote:
>> No, it wouldn't, but I thought it did. Is that statement for an older
>> version? It may just not handle it. rsync should work if you have
>> enough RAM.
>>
>> On Dec 28, 2007 3:10 PM, Bryan Penney <bp...@mu...> wrote:
>>> In reading about Unison I found a statement in the Caveats and
>>> Shortcomings section that said "Unison does not understand hard
>>> links". If this is true, would Unison work in this situation?
>>>
>>> On 12/28/2007 2:28 PM, dan wrote:
>>>> No, you will have to copy the entire 'pool' or 'cpool' over. You
>>>> could copy individual pc backups, BUT when the BackupPC nightly
>>>> runs it will remove any hardlinks from the pool that are not
>>>> needed elsewhere. When you copy over pc backups after that, they
>>>> will not use hardlinks, so your filesystem usage will go up a lot.
>>>> I would very much suggest you do it all in one shot.
>>>>
>>>> I know that time is against you on this, and that 2TB even over
>>>> gigabit is 5 hours, so I would suggest that you rsync the files
>>>> over once and leave your other machine up running backups. Then,
>>>> once it has finished, turn backups off and rsync the source to the
>>>> target again. Then you will have the bulk of the data over and
>>>> only have to pull changes.
>>>> I worry about the file count for 2TB being too much for rsync, so
>>>> consider Unison for the transfers. In my reading I have found that
>>>> though Unison has the same issue as rsync (same algorithms) for a
>>>> high number of files, it can handle more files in less memory.
>>>>
>>>> I have done this method to push about 800GB over and it worked
>>>> well, but my backup server has 2GB of RAM and runs gigabit.
>>>>
>>>> Maybe consider adding some network interfaces and channel bonding
>>>> them. I don't know if you have parts lying around, but channel
>>>> bonding in Linux is pretty easy, and you can aggregate each NIC's
>>>> bandwidth to reduce that transfer time, though I suspect that your
>>>> drives are not much faster than one gigabit NIC, so you might not
>>>> get much benefit on gigabit.
>>>>
>>>> On Dec 28, 2007 10:17 AM, Bryan Penney <bp...@mu...> wrote:
>>>>> We have a server running BackupPC that has filled up its 2TB
>>>>> partition (96% full anyway). We are planning on moving BackupPC
>>>>> to another server but would like to bring the history of backups
>>>>> over without waiting the extended period of time (days?) for the
>>>>> entire pool to copy. Is there any way to copy "pieces" of the
>>>>> pool, maybe per PC, at a time? This would allow us to migrate
>>>>> over the course of a few weeks without having days at a time with
>>>>> no backups.
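[Editor's note: the channel-bonding suggestion above looks roughly like the fragment below on a Debian-style system. This is a sketch, not a definitive configuration: the interface names `eth0`/`eth1`, the address, and the round-robin mode are all assumptions, and other distributions configure the bonding driver differently.]

```shell
# Hypothetical /etc/network/interfaces fragment bonding two NICs.
# The bonding driver itself is loaded first, e.g.:
#   modprobe bonding mode=balance-rr miimon=100
#
# auto bond0
# iface bond0 inet static
#     address 192.168.1.10        # assumed address
#     netmask 255.255.255.0
#     slaves eth0 eth1            # assumed interface names
#     bond-mode balance-rr        # round-robin: stripes traffic across both NICs
#     bond-miimon 100             # link check interval in ms
```

[As dan notes, disk throughput on the pool is often the real ceiling, so bonding only helps if the drives can outrun a single gigabit link.]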
A long time. You got gigabit?
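[Editor's note: the "how long" question can be ballparked. The 5-hour figure dan quotes earlier for 2TB over gigabit follows from an assumed usable throughput of about 110 MB/s (wire speed is 125 MB/s), and it ignores the substantial per-file overhead that millions of hardlinked pool files add. The arithmetic:]

```shell
# Back-of-envelope transfer time for 2TB over gigabit Ethernet.
# ~110 MB/s is an assumed realistic throughput, not a measured figure.
bytes=$((2 * 1000 ** 4))           # 2TB in bytes
rate=$((110 * 1000 ** 2))          # ~110 MB/s in bytes per second
seconds=$(( bytes / rate ))
echo "$(( seconds / 3600 )) hours" # prints "5 hours"
```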