From: Bill M. <mo...@ha...> - 2005-05-17 02:22:27
apt-proxy version: 1.9.28

apt-proxy has its own cache of packages. Also, /var/cache/apt/archives is
still used on each of the client machines. So, if the apt-proxy machine is
also a client, then there can be two copies of the package -- one in
/var/cache/apt/archives/ and the other in /var/cache/apt-proxy/. Is this
correct?

Before installing apt-proxy I would run this via cron:

    apt-get -qq update && \
    apt-get -qqd dist-upgrade && \
    apt-get -qq autoclean

That keeps my Packages list up to date and pre-fetches packages, so when I
do a dist-upgrade the packages are ready to be installed.

So, with apt-proxy I can use the same method as above to pre-fetch packages
(via cron during the night), but to avoid storing the packages in two
locations on the same machine I would use "clean" instead of "autoclean".
Then later, when I run a dist-upgrade, I'll fetch the packages from the
apt-proxy cache.

Is the above correct? Is that a typical usage for pre-fetching packages?

Next, I can't seem to import my existing .debs into the apt-proxy cache.
This seems to happen for every package:

2005/05/16 09:32 PDT [-] [import] m4_1.4.3-1_i386.deb skipped - no suitable backend found
2005/05/16 09:32 PDT [-] [import] libxml2-dev_2.6.16-7_i386.deb skipped - no suitable backend found

I just installed apt-proxy, ran "apt-get update" on the machine, and then
tried

    apt-proxy-import -i /var/cache/apt/archives

and received the above. Here are my backends:

[debian]
;; Backend servers, in order of preference
backends =
    http://ftp.us.debian.org/debian
    http://ftp.de.debian.org/debian
    http://ftp2.de.debian.org/debian
    ftp://ftp.uk.debian.org/debian

[debian-non-US]
;; Debian debian-non-US archive
;timeout will be the global value
backends =
    http://ftp.uk.debian.org/debian-non-US
    http://ftp.de.debian.org/debian-non-US
    ftp://ftp.uk.debian.org/debian

[security]
;; Debian security archive
backends =
    http://security.debian.org/debian-security
    http://ftp2.de.debian.org/debian-security

--
Bill Moseley
mo...@ha...
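For concreteness, the nightly job described above might be wired up as a
cron entry like this (the file name, schedule, and user are illustrative),
with "clean" in place of "autoclean" so the client-side copies are dropped
once apt-proxy has its own copy:

    # /etc/cron.d/apt-prefetch -- hypothetical placement and schedule
    # Pre-fetch during the night; "clean" (rather than "autoclean") empties
    # /var/cache/apt/archives afterwards, since apt-proxy keeps its own copy.
    30 3 * * * root apt-get -qq update && apt-get -qqd dist-upgrade && apt-get -qq clean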
From: Chris H. <ha...@de...> - 2005-05-20 13:09:48
On Monday 16 May 2005 17:53, Bill Moseley wrote:
> So, if the apt-proxy machine is also a client, then there can be two
> copies of the package -- one in /var/cache/apt/archives/ and the other
> in /var/cache/apt-proxy/. Is this correct?

Yes, correct.

> Before installing apt-proxy I would run this via cron:
>
>     apt-get -qq update && \
>     apt-get -qqd dist-upgrade && \
>     apt-get -qq autoclean
>
> That keeps my Packages list up to date and pre-fetches packages, so
> when I do a dist-upgrade the packages are ready to be installed.
>
> So, with apt-proxy I can use the same method as above to pre-fetch
> packages (via cron during the night), but to avoid storing the packages
> in two locations on the same machine I would use "clean" instead of
> "autoclean". Then later, when I run a dist-upgrade, I'll fetch the
> packages from the apt-proxy cache.
>
> Is the above correct? Is that a typical usage for pre-fetching
> packages?

Yes, that would be fine.

One feature I'd like to add when I have time is to implement the prefetching
in a-p itself. We already store the last time a client requested each
package, so it would be possible to automatically download updates of all
packages that clients have been downloading recently. Now I just need some
time, or someone, to implement it :)

> Next, I can't seem to import my existing .debs into the apt-proxy
> cache. This seems to happen for every package:

I accidentally broke apt-proxy-import and max_versions a while ago. 1.9.30
fixes this, please upgrade.

Chris
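To sketch the idea, a prefetcher along those lines could look something like
the following. This is hypothetical, not apt-proxy code: the last-request
record passed in is an invented {package_name: unix_time} mapping (apt-proxy's
real database layout may differ), and the proxy URL and Packages path are
examples. It simply re-requests the current version of recently used packages
through the proxy, so anything not already cached gets downloaded and cached:

    # Hypothetical prefetch sketch -- not apt-proxy code. "last_request" is an
    # invented {package_name: unix_time} record; the real on-disk databases
    # may be laid out differently.
    import time
    import urllib.request
    import apt_pkg

    PROXY = "http://localhost:9999/debian"   # example apt-proxy backend URL
    PACKAGES = "/var/cache/apt-proxy/debian/dists/unstable/main/binary-i386/Packages"
    MAX_AGE = 14 * 24 * 3600                  # only packages requested in the last two weeks

    def prefetch(last_request):
        cutoff = time.time() - MAX_AGE
        wanted = set(name for name, t in last_request.items() if t >= cutoff)
        with open(PACKAGES) as f:
            for section in apt_pkg.TagFile(f):
                if section["Package"] in wanted:
                    # Requesting the current Filename through the proxy makes it
                    # download and cache the file if it is not cached already.
                    urllib.request.urlopen(PROXY + "/" + section["Filename"]).read()

Run nightly, that would give roughly the same effect as the client-side cron
job, but centralised in the proxy's cache.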
From: Bill M. <mo...@ha...> - 2005-05-20 14:05:24
On Fri, May 20, 2005 at 02:02:49PM +0100, Chris Halls wrote:
> One feature I'd like to add when I have time is to implement the
> prefetching in a-p itself. We already store the last time a client
> requested each package, so it would be possible to automatically download
> updates of all packages that clients have been downloading recently. Now I
> just need some time, or someone, to implement it :)

Of course, the reason I pre-fetch is to make upgrades on sid fast -- if you
wait a week there are quite a few packages to fetch.

Another feature I would love is to be able to query apt-proxy and 1) show
when a package was installed (and what packages were installed because of
it), and 2) see (and sort) packages by installation date ("what were those
three CMS packages I was testing out last week?").

> > Next, I can't seem to import my existing .debs into the apt-proxy
> > cache. This seems to happen for every package:
>
> I accidentally broke apt-proxy-import and max_versions a while ago. 1.9.30
> fixes this, please upgrade.

Ah, thanks. Ok, trying.... No, still not working.

How does the import work? Does it look at the file name, then try to find
that file in Packages.gz (or a related .db) and look up its path? So for
libdb1-compat:

2005/05/18 21:41 PDT [-] [import] libdb1-compat_2.1.3-7_i386.deb skipped - no suitable backend found
[...]

it finds it in this Packages.gz:

bumby:/var/cache/apt-proxy$ zgrep libdb1-compat ./debian/dists/unstable/main/binary-i386/Packages.gz
Depends: libdb1-compat
Package: libdb1-compat
Filename: pool/main/d/db1-compat/libdb1-compat_2.1.3-7_i386.deb

and then knows to place it in pool/main/d/db1-compat/?

Here's what's happening, and my two config files:

bumby:/home/moseley# apt-get install apt-proxy
bumby:/home/moseley# apt-get update
bumby:/home/moseley# apt-cache policy apt-proxy
apt-proxy:
  Installed: 1.9.30
  Candidate: 1.9.30
  Version Table:
 *** 1.9.30 0
        500 http://bumby unstable/main Packages
        100 /var/lib/dpkg/status

bumby:/home/moseley# apt-proxy-import -i /var/cache/apt/archives
Updating twisted's process module. No updating required.
2005/05/20 06:57 PDT [-] Log opened.
2005/05/20 06:57 PDT [-] [apt_pkg] Loading Packages database for /var/cache/apt-proxy/.apt-proxy/backends/security
2005/05/20 06:57 PDT [-] [apt_pkg] Loading Packages database for /var/cache/apt-proxy/.apt-proxy/backends/marillat
2005/05/20 06:57 PDT [-] [apt_pkg] Loading Packages database for /var/cache/apt-proxy/.apt-proxy/backends/debian
2005/05/20 06:57 PDT [-] [apt_pkg] Loading Packages database for /var/cache/apt-proxy/.apt-proxy/backends/debian-non-US
2005/05/20 06:57 PDT [-] [import] libdb1-compat_2.1.3-7_i386.deb skipped - no suitable backend found
2005/05/20 06:57 PDT [-] [import] libtext-charwidth-perl_0.04-2_i386.deb skipped - no suitable backend found
[...]

:r /etc/apt-proxy/apt-proxy-v2.conf

[DEFAULT]
;; All times are in seconds, but you can add a suffix
;; for minutes(m), hours(h) or days(d)

;; Server IP to listen on
;address = 192.168.0.254

;; Server port to listen on
port = 9999

;; Control files (Packages/Sources/Contents) refresh rate
;;
;; Minimum time between attempts to refresh a file
min_refresh_delay = 1h

;; Minimum age of a file before attempting an update (NOT YET IMPLEMENTED)
;min_age = 23h

;; Uncomment to make apt-proxy continue downloading even if all
;; clients disconnect. This is probably not a good idea on a
;; dial up line.
;; complete_clientless_downloads = 1

;; Debugging settings.
;; for all debug information use this:
;; debug = all:9
debug = all:4 db:0

;; Debugging remote python console
;; Do not enable in an untrusted environment
;telnet_port = 9998
;telnet_user = apt-proxy
;telnet_password = secret

;; Network timeout when retrieving from backend servers
timeout = 15

;; Cache directory for apt-proxy
cache_dir = /var/cache/apt-proxy

;; Use passive FTP? (default=on)
;passive_ftp = on

;; Use HTTP proxy?
;http_proxy = host:port

;; Enable HTTP pipelining within apt-proxy (for test purposes)
;disable_pipelining=0

;;--------------------------------------------------------------
;; Cache housekeeping

;; Time to perform periodic housekeeping:
;;  - delete files that have not been accessed in max_age
;;  - scan cache directories and update internal tables
cleanup_freq = 1d

;; Maximum age of files before deletion from the cache (seconds)
max_age = 120d

;; Maximum number of versions of a .deb to keep per distribution
max_versions = 3

;; Add HTTP backends dynamically if not already defined? (default=on)
;dynamic_backends = on

;;---------------------------------------------------------------
;; Backend servers
;;
;; Place each server in its own [section]

[debian]
;; The main Debian archive
;; You can override the default timeout like this:
;timeout = 30

;; Rsync server used to rsync the Packages file (NOT YET IMPLEMENTED)
;;rsyncpackages = rsync://ftp.de.debian.org/debian

;; Backend servers, in order of preference
backends =
    http://ftp.us.debian.org/debian
    http://ftp.de.debian.org/debian
    http://ftp2.de.debian.org/debian
    ftp://ftp.uk.debian.org/debian

[debian-non-US]
;; Debian debian-non-US archive
;timeout will be the global value
backends =
    http://ftp.uk.debian.org/debian-non-US
    http://ftp.de.debian.org/debian-non-US
    ftp://ftp.uk.debian.org/debian

[security]
;; Debian security archive
backends =
    http://security.debian.org/debian-security
    http://ftp2.de.debian.org/debian-security

# --
[marillat]
backends = ftp://ftp.nerim.net/debian-marillat

;;[ubuntu]
;; Ubuntu archive
;; backends = http://archive.ubuntu.com/ubuntu

;;[ubuntu-security]
;; Ubuntu security updates
;; backends = http://security.ubuntu.com/ubuntu

;[openoffice]
;; OpenOffice.org packages
;backends =
;    http://ftp.freenet.de/pub/debian-openoffice
;    http://ftp.sh.cvut.cz/MIRRORS/OpenOffice.deb
;    http://borft.student.utwente.nl/debian

;[apt-proxy]
;; Apt-proxy new versions
;backends = http://apt-proxy.sourceforge.net/apt-proxy

;[backports.org]
;; backports.org
;backends = http://backports.org/debian

;[blackdown]
;; Blackdown Java
;backends = http://ftp.gwdg.de/pub/languages/java/linux/debian

;[debian-people]
;; people.debian.org
;backends = http://people.debian.org

;[emdebian]
;; The Emdebian project
;backends = http://emdebian.sourceforge.net/emdebian

;[rsync]
;; An example using an rsync server. This is not recommended
;; unless http is not available, because rsync is only more
;; efficient for transferring uncompressed files and puts much
;; more overhead on the server. See the rsyncpackages parameter
;; for a way of rsyncing just the Packages files.
;backends = rsync://ftp.uk.debian.org/debian

:r /etc/apt/sources.list

# Normal Sid sources
#deb http://http.us.debian.org/debian/ unstable main non-free contrib
#deb-src http://http.us.debian.org/debian/ unstable main non-free contrib
# (apt-proxy)
deb http://bumby:9999/debian unstable main non-free contrib
deb-src http://bumby:9999/debian unstable main non-free contrib

# non-free
#deb http://non-us.debian.org/debian-non-US unstable/non-US main contrib non-free
#deb-src http://non-us.debian.org/debian-non-US unstable/non-US main contrib non-free
# (apt-proxy)
deb http://bumby:9999/debian-non-US unstable/non-US main contrib non-free
deb-src http://bumby:9999/debian-non-US unstable/non-US main contrib non-free

# Experimental
#deb http://http.us.debian.org/debian/ experimental main non-free contrib
#deb-src http://http.us.debian.org/debian/ experimental main non-free contrib
#deb http://non-us.debian.org/debian-non-US experimental/non-US main contrib non-free
#deb-src http://non-us.debian.org/debian-non-US experimental/non-US main contrib non-free

# For VLC (VideoLan.org)
#deb http://download.videolan.org/pub/videolan/debian $(ARCH)/
#deb-src http://download.videolan.org/pub/videolan/debian sources/

# http://www.musicpd.org/forum/viewtopic.php?t=170
#deb http://www.yhbt.net/normalperson/debian/MusicPD/releases ./

# transcode
#deb ftp://ftp.nerim.net/debian-marillat/ unstable main

# Cinelerra
deb http://www.kiberpipa.org/~minmax/cinelerra/builds/sid ./
deb-src http://www.kiberpipa.org/~minmax/cinelerra/builds/sid ./

# Updated slimp3
deb http://www.litux.org/debian unstable/

--
Bill Moseley
mo...@ha...
From: Chris H. <ha...@de...> - 2005-06-09 10:34:57
Hi Bill, sorry for the long delay.

On Friday 20 May 2005 15:05, Bill Moseley wrote:
> Another feature I would love is to be able to query apt-proxy and 1) show
> when a package was installed (and what packages were installed because of
> it), and 2) see (and sort) packages by installation date ("what were those
> three CMS packages I was testing out last week?").

With apt-proxy using twisted, it shouldn't be too difficult to generate extra
web pages, although the access to the databases needs to be cleaned up first
- there isn't any well-documented set of access functions.

> Ah, thanks. Ok, trying.... No, still not working.

I agree, it's still not working :( It seems my existing install started
working after the fixes, but when I tried on a completely new cache
directory, it didn't work either.

> How does the import work? Does it look at the file name, then try to
> find that file in Packages.gz (or a related .db) and look up its path?

I'll try to explain the complete flow, starting from an empty cache:

1. A client requests a Packages[.gz] file for a given backend

[in the apt-proxy daemon:]
2. The Packages file is downloaded
3. If it was Packages.gz, the file is uncompressed using gzip in the
   background. This was the step I broke :)
4. The uncompressed Packages file is registered in the database found at
   /var/cache/apt-proxy/.apt-proxy/backends/<backend name>/packages.db

[in apt-proxy-import:]
5. apt-proxy-import is started
6. for each .deb, a-p-i extracts the package name
7. for each backend, a-p-i extracts a list of Packages files from the
   database
8. the contents of each Packages file is read using python-apt
9. a-p-i looks in the python-apt cache for the package name
10. if the exact version is found in a Packages file, it copies the file
    there
11. otherwise, if a package with the same name but a different version is
    found, it will copy the file to the same path as the other version.

So, going back to your original question: it looks at the package name
embedded in the .deb, and tries to find that path in the Packages.gz.

I have added a command line option to specify the configuration file, to
make it easier to start a new a-p with an empty cache listening on a
different port, but it needs some cleaning up before it is released.

In my debugging I have established that step 8 is failing for some reason:
the python-apt cache ends up being empty. I'm not sure yet why this is
happening.

Chris
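To illustrate steps 6-11, here is a rough sketch of that lookup using
python-apt's apt_pkg.TagFile directly. It is not apt-proxy-import's actual
code: the paths are examples, parsing the name/version out of the filename is
a simplification of step 6, and the fallback in step 11 (reusing the path of
another version of the same package) is left out.

    # Illustration only -- not apt-proxy-import's code. Parsing the package
    # name and version from the filename is a simplification of step 6, and
    # epochs (encoded as %3a in pool filenames) are ignored.
    import os
    import shutil
    import apt_pkg

    def index_packages_file(packages_path):
        """Map (package, version) -> Filename for one uncompressed Packages file."""
        index = {}
        with open(packages_path) as f:
            for section in apt_pkg.TagFile(f):
                index[(section["Package"], section["Version"])] = section["Filename"]
        return index

    def import_deb(deb_path, backend_cache_dir, index):
        """Copy a cached .deb into the backend's pool layout if the index knows it."""
        name, version, _arch = os.path.basename(deb_path).split("_")
        filename = index.get((name, version))
        if filename is None:
            print("%s skipped - no suitable backend found" % os.path.basename(deb_path))
            return
        dest = os.path.join(backend_cache_dir, filename)
        os.makedirs(os.path.dirname(dest), exist_ok=True)
        shutil.copy2(deb_path, dest)

    # Hypothetical usage:
    # idx = index_packages_file(
    #     "/var/cache/apt-proxy/debian/dists/unstable/main/binary-i386/Packages")
    # import_deb("/var/cache/apt/archives/libdb1-compat_2.1.3-7_i386.deb",
    #            "/var/cache/apt-proxy/debian", idx)

If step 8 leaves the index empty, as in Chris's debugging, every .deb falls
into the "skipped - no suitable backend found" branch, which matches the log
output earlier in the thread.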