[squid-users] URL Cacheing Question
squid3 at treenet.co.nz
Mon Dec 14 19:05:49 UTC 2015
On 14/12/2015 11:07 p.m., Igor Dzombic wrote:
> Hallo Squid Team,
> i have one Question, i’m trying to Configure Squid-Proxy-Server and i need
> your help. It should work like this:
> - i should cache one URL (like
> http://backend.my-server.net/servlets/fgi/onepage.php) and refrashe it
> every 5 min. on a Squid-Proxy Server
> - My Clients should access ONLY this page and in a case that this
> page ist not reachable (no Internet-Connection), clients should load the
> Page from Proxy Cache
Firstly, cache != backup.
An HTTP cache works by taking client requests and storing the responses
that had to be fetched from a remote server. If the clients are not
using the proxy constantly while the server is working, then the proxy
and its cache will not know what response object that URL is supposed to
be supplying when the server goes down. By then it will be too late.
=> This means you have to make the proxy the place the clients go, or
at least a large portion of them. Squid then acts as a CDN-style
frontend for the origin server.
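To illustrate, a minimal reverse-proxy (accelerator) squid.conf sketch
for that setup might look like the following. The hostname and ports
are taken from the question; everything else is an assumption about
your environment, not a drop-in configuration:

```
# Sketch only: accept client traffic on port 80 for the backend site
http_port 80 accel defaultsite=backend.my-server.net

# Forward cache misses to the origin server (assumed to listen on port 80)
cache_peer backend.my-server.net parent 80 0 no-query originserver name=backend

# Only permit requests for the one site being accelerated
acl our_site dstdomain backend.my-server.net
http_access allow our_site
http_access deny all
```

With the clients pointed at (or transparently routed through) this
proxy, every fetch of the URL passes through Squid, so the cache has a
copy on hand when the origin becomes unreachable.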
The other thing about caches is that they place a very strong emphasis
on delivering *accurate* content. That means that sometimes, when the
server is down, they will generate an error page instead of delivering
outdated content, even if it is cached.
It is up to the origin server to explicitly specify how long cached
content may be stored without even checking back to the origin
(Cache-Control: s-maxage=N, max-age=N, or an Expires: header) and how
long after that expiry time it is still allowed to be served even if
the origin it comes from has gone down (Cache-Control: stale-if-error=N).
=> This means you have to make the origin deliver the required cache
control headers so the cache stores things for your 5 min period, with
N seconds of failure hiding.
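For the 5-minute refresh described above, the origin's response headers
could look something like this. The 1-hour stale-if-error window is an
example value I have chosen, not something from the question:

```
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: s-maxage=300, stale-if-error=3600
```

Here s-maxage=300 lets shared caches such as Squid serve the object for
5 minutes without revalidating, and stale-if-error=3600 (defined in RFC
5861) permits the stale copy to be served for up to an hour when the
origin cannot be reached.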