[squid-users] Caching mirrored origin server
Alex Rousskov
rousskov at measurement-factory.com
Thu Jan 3 16:34:34 UTC 2019
On 1/2/19 3:01 PM, jimc wrote:
> I'm using squid-4.4-2.1.x86_64 from OpenSuSE Tumbleweed. My goal is
> that when doing periodic software updates, each host in my department
> will contact my proxy to obtain the new metadata and packages (SuSE has
> a syntax for this); the proxy will download each file only once. This
> sounds like pretty standard Squid operation, but there's a gross botfly
> in the ointment: the origin servers return 302 Found, each time
> redirecting to a different mirror, and with "normal" configuration this
> result is passed back to the client, which makes a new connection (via
> the proxy) to that mirror, but the retrieved file will likely never be
> accessed again from that mirror.
The default solution for mapping many URLs to a single cache entry is the
store_id helper. That solution is applicable only when every URL in the
set produces the same content.
* https://wiki.squid-cache.org/Features/StoreID
* http://www.squid-cache.org/Doc/config/store_id_program/
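For illustration, a minimal squid.conf fragment wiring up such a helper
could look like the sketch below. The helper path and the ACL domain are
hypothetical; restrict store_id_access to the traffic you actually want
rewritten:

```
# Hypothetical wiring; adjust the helper path and ACLs to your setup.
store_id_program /usr/local/bin/mirror_storeid.py
store_id_children 5 startup=1 idle=1

# Only rewrite requests that look like mirror fetches.
acl susemirrors dstdomain .opensuse.org
store_id_access allow susemirrors
store_id_access deny all
```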
> mirrors come and go, and curating the mirror
> selection turned out to be a reliability problem, so I gave it up.
If there is a common pattern to all mirrors for a given URL, then
store_id can help. You would still be responsible for curating the URL
mapping/patterns, of course, but it may be (more) manageable.
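As a sketch of what that curation might look like in helper form: the
helper below assumes (hypothetically) that every mirror serves the same
file under an identical path starting with /tumbleweed/, and collapses
all such URLs onto one canonical store ID. It speaks the basic store_id
helper protocol: one URL per input line, answered with "OK store-id=..."
or "ERR" (no concurrency channel IDs).

```python
#!/usr/bin/env python3
"""Hypothetical store_id helper: map mirror URLs sharing a path to one ID."""
import re
import sys

# Assumed layout: any mirror host, path beginning with /tumbleweed/.
MIRROR_PATH = re.compile(r'^https?://[^/]+(/tumbleweed/.*)$')

def store_id(url):
    """Return a canonical store ID for mirror URLs, or None to pass through."""
    m = MIRROR_PATH.match(url)
    if m:
        # Same path on any mirror -> same cache entry.
        return 'http://mirrors.invalid' + m.group(1)
    return None

def main():
    for line in sys.stdin:
        tokens = line.split()
        if not tokens:
            continue
        mapped = store_id(tokens[0])   # first token is the request URL
        if mapped:
            sys.stdout.write('OK store-id=%s\n' % mapped)
        else:
            sys.stdout.write('ERR\n')  # leave the URL unchanged
        sys.stdout.flush()             # helpers must not buffer replies

if __name__ == '__main__':
    main()
```

The "mirrors.invalid" host never resolves; it exists only inside the
cache as a key, which is exactly the point: the client-visible URL is
untouched, only the storage key is unified.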
There is no built-in "follow redirect but cache under the original URL"
feature in Squid, probably because such a feature would result in
serving wrong responses in many typical cases. With store_id, the
decision to map URLs and the headaches/risks of doing so are all on your
side.
With some Squid development work, the missing feature can be implemented
on top of the existing adaptation interfaces and the core store_id
functionality, but nobody has done that so far IIRC.
HTH,
Alex.