From: Piet B. <pie...@ya...> - 2002-12-31 20:19:03

I generally do a 'wget' to do the dirty work of generating those
512-pixel-wide images ahead of time, instead of when a visitor first
hits the site. This can take a long time if you have a lot of images,
or a lot of big images, especially on a slow machine, but at least the
results get stored in the image-cache directory.

The command to recursively fetch up to 5 levels of links from a site
is:

    wget -r -np -l 5 URL

(where "URL" is your home page). You can add the --delete-after option
to have wget discard the downloaded files immediately; the point is
only to make the server generate and cache the images. A sketch of a
script along these lines is at the end of this message.

--- Scott Sawyer <ss...@sc...> wrote:
> I would concur with that assessment.
>
> I also notice that for the most part people only look at the 512
> sizes of the images. If the previewmaker could create those as well,
> that would solve a lot of the speed issues that I see.
>
> I do, though, have probably over 10k images at this point, and that
> will eat up a lot of disk space. The whole speed-for-space problem.
>
> -Scott
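
For reference, a minimal sketch of how the command could be wrapped
for a nightly cron job; the URL and script path below are placeholders,
not from any real setup:

    #!/bin/sh
    # Warm the gallery's image cache by crawling the site. The server
    # resizes and caches each 512-pixel image the first time wget
    # requests it, so later visitors get the cached copy.
    #
    #   -r             recurse through links
    #   -np            never ascend to the parent directory
    #   -l 5           follow links at most 5 levels deep
    #   -q             quiet; no output (handy under cron)
    #   --delete-after discard each file locally after download;
    #                  only the server-side cache matters
    wget -r -np -l 5 -q --delete-after http://www.example.com/gallery/

A crontab line such as

    0 3 * * * /home/piet/warm-cache.sh

would rebuild the cache every night at 3am, so the images are ready
before anyone visits.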