[squid-users] HDD/RAM Capacity vs store_avg_object_size

bugreporter bugreporter2017 at gmail.com
Fri Jul 14 08:11:44 UTC 2017


Hi Alex,

By doing so I'll get a new (or the same) rough estimation, which is not what
I'm really looking for. What I actually need is a formula based on the mean
object size, so that I can periodically (via a cron job) measure the mean
object size and use that formula to reconfigure Squid accordingly. The
reconfiguration would be as follows:

- If the mean object size is too low compared to the RAM/HDD ratio, then I
can reduce Squid's HDD usage (cache_dir ...  Low-Mbytes-Size ...). Should a
reload of the new Squid configuration be sufficient, or will I need to
restart Squid?

- If the mean object size is too high compared to the RAM/HDD ratio, then I
can let Squid use the HDD fully and apply some optimizations (for instance,
since the RAM will not be fully consumed by the in-memory index, I can give
the surplus to cache_mem).

I need this level of automation to be able to install Squid in heterogeneous
environments (each with a different mean object size) without having to
propose a new configuration for every installation.
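The periodic measurement itself could look something like this sketch. It assumes the native access.log format, where field 5 is the reply size in bytes; the step that rewrites the cache_dir line in squid.conf is left as a comment because it depends on how the configuration is generated:

```shell
#!/bin/sh
# Hedged sketch, not a drop-in script: measure the mean object size from a
# Squid native-format access.log (field 5 = reply size in bytes by default),
# then reload Squid so a regenerated squid.conf takes effect.

LOG=${1:-/var/log/squid/access.log}

# Average the reply sizes and convert to whole kilobytes.
MEAN_KB=$(awk '{ bytes += $5; n++ } END { if (n) printf "%d", bytes / n / 1024 }' "$LOG")
echo "mean object size: ${MEAN_KB} KB"

# ...rewrite the cache_dir Mbytes value in squid.conf here, then apply it
# without a full restart:
# squid -k reconfigure
```

Run from cron, this would keep the cache_dir size tracking the observed workload.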

Kind Regards,



-----
Bug Reporter Contributor
OpenSource = Open-Minded
--
View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/HDD-RAM-Capacity-vs-store-avg-object-size-tp4683072p4683098.html
Sent from the Squid - Users mailing list archive at Nabble.com.
