[squid-users] Large Files Not Caching
ranger at opennms.org
Thu Nov 12 17:54:51 UTC 2015
On 11/12/15 12:35 PM, Antony Stone wrote:
>>> I'm trying to set up a CDN-like frontend to our (bandwidth-constrained)
>>> master package repository. Everything seems to be working (including
>>> memory cache hits) except for some reason it does not seem to be
>>> caching/keeping large files.
> Define "large"?
Sorry. To back up a little:
squid version: 3.4.8-6+deb8u1 (debian jessie)
With that config, memory hits to the cache work fine. However, if I try
to download something that's a couple of MB, it never gets written to
either cache directory.
I get this in the store.log:
> 1447350253.330 RELEASE -1 FFFFFFFF 41BD9B4385C540AB29F252B7B7DDF41C
>   200 1447350184 1447185078 1447954984 application/x-rpm 2368070/2368070
>   GET
...and this in the access.log:
> 1447350253.330  70000 2606:a000:45e2:1200:f0cb:6c0a:1e57:68bd
>   TCP_MISS/200 2368590 GET
>   - TIMEOUT_FIRSTUP_PARENT/220.127.116.11 application/x-rpm
On a second hit, I get the same thing, RELEASE and TCP_MISS.
>>> Attached is my configuration. Is there something obvious that I'm missing?
>>> maximum_object_size 600 MB
> I assume you don't mean "it's not caching stuff bigger than 600 Mb"
Hah, no. The goal is to cache the most popular RPM and Debian packages
and to spread the load out geographically. Most of them are somewhere
between 20 and 300 MB. Unfortunately, right now it seems to only cache
what fits in memory.
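For reference, if I understand the squid.conf documentation correctly,
directive order can matter here: each cache_dir picks up the
maximum_object_size in effect at the point it is parsed, so a
maximum_object_size set *after* the cache_dir lines leaves them at the
4 MB default. A sketch of what I mean (paths and sizes below are
illustrative placeholders, not my actual config):

```
# Set maximum_object_size BEFORE any cache_dir lines, otherwise the
# cache_dirs keep whatever limit (default 4 MB) was in effect when parsed.
maximum_object_size 600 MB

# Illustrative cache_dir lines; max-size/min-size can also be set
# per-directory to steer large objects into a dedicated directory.
cache_dir aufs /var/spool/squid/small 10000 16 256 max-size=1048576
cache_dir aufs /var/spool/squid/large 100000 16 256 min-size=1048576
```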
Also, sorry, just noticed this after I'd already reply-all'd:
> Please reply to the list;
> please *don't* CC me.
I won't do it again... :/