From: Toby J. <pu...@to...> - 2002-12-09 03:32:14
> I think you can have the best of both here. What I mean is that
> you can use the ConfigDef to define the parameters and their
> types, structure and relationships, *and* you can maintain those
> rich structures in the database schema via the Backup::Config
> 'API'. If you had methods for creating/reading/writing/deleting
> configuration objects, and the ability to parse the ConfigDef and
> create the SQL dynamically based on its structure, then you have
> a very flexible and powerful configuration tool.

I was thinking along these lines, but not going so far as to generate the tables themselves dynamically. In an absolutely monstrous BackupPC installation, we may be talking several thousand hosts. Figure around 100 config items per host, and that's still less than half a million records. I don't think space would be much of a concern in such an installation, but I'm sure the sorts of joins needed to pull all those config parameters out of dynamically generated per-structure tables would be rather time-consuming. So I think the one-table approach is probably best.

toby
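For what it's worth, a minimal sketch of what I mean by the one-table approach, using SQLite for illustration (the table and column names are hypothetical, not BackupPC's actual schema, and the parameter values are made up): every config item for every host goes into a single (host, name, value) table, so reading a host's full configuration is one indexed lookup rather than a multi-way join.

```python
# Sketch of the flat one-table config store discussed above.
# Schema and names are illustrative assumptions, not BackupPC's real layout.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE host_config (
        host  TEXT NOT NULL,
        name  TEXT NOT NULL,
        value TEXT,
        PRIMARY KEY (host, name)
    )
""")

# ~100 config items per host times a few thousand hosts stays
# well under half a million rows, per the estimate above.
conn.executemany(
    "INSERT INTO host_config (host, name, value) VALUES (?, ?, ?)",
    [("host1", "FullPeriod", "6.97"),
     ("host1", "XferMethod", "rsync"),
     ("host2", "FullPeriod", "13.97")],
)

def read_config(conn, host):
    """Fetch every parameter for one host with a single query, no joins."""
    rows = conn.execute(
        "SELECT name, value FROM host_config WHERE host = ?", (host,)
    )
    return dict(rows)

print(read_config(conn, "host1"))
```

The trade-off is that everything is stored as text and the ConfigDef has to supply the typing and structure on the way in and out, but the read path stays a single indexed scan no matter how many parameters exist.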