From: Jonathan K. <jk...@cs...> - 2005-01-31 16:01:54
On Mon, 31 Jan 2005, Micha wrote:

> But assume I'd like to copy a 7 GB cache to another freshly installed
> machine with the same OS (say Debian sarge). It would be much easier
> to cpio or tar the cache dir over Ethernet. Would it work? What do you
> think I'd have to consider?

It will cause recycling as the database is created/updated.

Personally, I think an external utility that pretty much just fills in a
database is kind of dumb. I also don't understand why recycling is
needed. Yeah, I know what it does; I just don't understand why it needs
to be a separate process. It seems like all you have to do is have the
code be something like:

    handle_request(file):
        if file doesn't exist:
            fetch_file_from_backend(file)
        set_atime(file, `date`)
        send_file(file)

    cleanup_cache():
        foreach file in cache:
            atime = get_atime(file)
            if atime == None:
                atime = get_mtime(file)   # mtime exists everywhere
            if (`date` - atime) > cleanuptime:
                unlink(file)
                delete_atime(file)

This way the database is updated on demand. If the file has never been
sent, then it gets deleted when its mtime (which is identical to its
ctime) expires. Best of all, it eliminates the need for arcane commands
like apt-proxy-import, because the atime database just works.

Of course, I'm still bitter that upon upgrading I went from a system
that worked just fine to a system that doesn't work nearly as well.
Also, I don't like apt-proxy-import right now because it doesn't work
for me: instead of just copying files and updating a database, it's
looking for backends it can't find for some reason. Right now I think
it's trying to be too smart and doing too many things. Don't bother
about whether or not the file is current; just stick it in the cache
and let apt-proxy handle it. That's apt-proxy's job.

-- 
Jonathan Koren                   World domination? I'll leave that to the
jk...@cs...                      religious nuts and Republicans, thank you.
http://www.cs.siu.edu/~jkoren/     -- The Monarch, "Venture Brothers"
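For what it's worth, the on-demand scheme sketched above can be written out
as a small, self-contained Python sketch. This is just an illustration of
the idea, not apt-proxy's actual code: the `atime_db` dict (standing in for
the atime database), `fetch_file_from_backend`, and the `CLEANUP_TIME`
value are all made up for the example.

    #!/usr/bin/env python
    # Hypothetical sketch of the on-demand cache-cleanup idea above.
    # Not apt-proxy code: atime_db, fetch_file_from_backend, and
    # CLEANUP_TIME are illustrative stand-ins.
    import os
    import time

    CLEANUP_TIME = 30 * 24 * 3600   # expire files untouched for 30 days
    atime_db = {}                   # path -> last time the file was sent

    def fetch_file_from_backend(path):
        # Placeholder: a real proxy would download from its backend here.
        with open(path, "w") as f:
            f.write("fetched\n")

    def handle_request(path):
        """Serve a file, fetching on a miss and recording the access."""
        if not os.path.exists(path):
            fetch_file_from_backend(path)
        atime_db[path] = time.time()    # database updated on demand
        with open(path) as f:
            return f.read()

    def cleanup_cache(cache_dir):
        """Expire files whose recorded atime (or, failing that, mtime)
        is older than CLEANUP_TIME."""
        now = time.time()
        for name in os.listdir(cache_dir):
            path = os.path.join(cache_dir, name)
            atime = atime_db.get(path)
            if atime is None:
                atime = os.path.getmtime(path)  # mtime exists everywhere
            if now - atime > CLEANUP_TIME:
                os.unlink(path)
                atime_db.pop(path, None)

The point being: no separate import step, no recycling pass. A file copied
straight into the cache dir simply has no atime entry yet, so its mtime is
used until the first request records a real access.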