Quoting Radosław Korzeniewski <radoslaw@...>:
> 2012/11/5 <lst_hoe02@...>
>> No, we have around 3.5 billion files with some jobs, with no problem,
> Wow, 3.5 _billion_ files on a single job. Amazing.
Sorry, I confused million/billion (short vs. long scale). I should have
been more precise: 3.5 x 10^6 (German "Million", not "Milliarde").
>> but we use PostgreSQL as backend db and don't do base jobs.
> Could you share some information about your setup? It is very interesting
> how Bacula and PostgreSQL can handle this kind of _huge_ backup job. What
> hardware? What archive device (disk? tape? both?)? I can imagine that you
> must have no less than 10 _billion_ files in your catalog. The largest
> Bacula deployment I have ever seen was a few hundred million files in a
> whole catalog, not a single job.
So we are, depending on which scale (short or long) you had in mind, a
factor of 1,000 or 1,000,000 smaller than expected ;-)
The real numbers are only 1.45 TB and 3,500,000 files for one machine.
The others are somewhat smaller but there is still another with
The whole catalog is around 7GB as of now, so no problem.
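In case it helps, numbers like the above can be pulled straight from the catalog with psql. This is just a sketch assuming the default Bacula PostgreSQL schema (Job table with Name and JobFiles columns) and a catalog database named "bacula"; adjust names to your setup.

```sql
-- Total on-disk size of the catalog database
-- (assumes the catalog DB is named 'bacula')
SELECT pg_size_pretty(pg_database_size('bacula'));

-- The five jobs with the most files, using the standard Job table
SELECT JobId, Name, JobFiles
FROM Job
ORDER BY JobFiles DESC
LIMIT 5;
```

JobFiles is maintained per job by the director, so this avoids counting rows in the (much larger) File table.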
So let's repeat: I will never use ambiguous "billion" again...