[squid-users] Regex optimization
yvoinov at gmail.com
Thu Jun 16 20:28:44 UTC 2016
I propose to nominate this for second place in the contest "The most
inefficient use of computing resources - 2016." :-!:-D
Because first place is already occupied. :-D 30 million porn sites in one
Squid ACL, and 7 minutes for squid -k reconfigure. 8-)
On 17.06.2016 at 1:20, Antony Stone wrote:
> On Thursday 16 June 2016 at 21:11:50, Alfredo Rezinovsky wrote:
>> Well.. I tried.
>> I need to ban 8613 URLs. Because of a law.
> Have you considered https://www.urlfilterdb.com/products/ufdbguard.html ?
>> If I put one URL per line in a file and point a url_regex acl at that
>> file, it works. But when traffic goes up, CPU load hits 100% (even with
>> workers) and the proxy becomes unusable.
> Er, I'm not surprised.
>> I tested and saw that my Squid can't parse regexes longer than 8192
>> characters. I managed to combine the 8000+ URIs into 34 regexes using a
>> Ruby gem, and CPU load stays almost at the same level as without any ACL.
> That must be *way* past anything to be described as "maintainable".
>> the regex is:
> Er, thanks, that confirms my suspicions above :)
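The thread doesn't name the Ruby gem Alfredo used, but collapsing thousands of literal URLs into a few compact regexes is typically done with a prefix trie: shared prefixes are factored out once, so the engine never scans thousands of alternatives. Below is a minimal Python sketch of that trie-to-regex idea (function names are illustrative, not from any tool mentioned in the thread):

```python
import re

def build_trie(words):
    """Insert each word into a nested-dict trie; '' marks a word end."""
    trie = {}
    for w in words:
        node = trie
        for ch in w:
            node = node.setdefault(ch, {})
        node[''] = {}
    return trie

def trie_to_regex(node):
    """Emit a compact regex matching exactly the words in the trie."""
    end = '' in node                      # some word terminates here
    branches = {c: n for c, n in node.items() if c != ''}
    if not branches:
        return ''
    alts, single_chars = [], []
    for ch, child in sorted(branches.items()):
        sub = trie_to_regex(child)
        if sub:
            alts.append(re.escape(ch) + sub)
        else:
            single_chars.append(re.escape(ch))
    if single_chars:                      # fold lone characters into a class
        alts.append(single_chars[0] if len(single_chars) == 1
                    else '[' + ''.join(single_chars) + ']')
    body = alts[0] if len(alts) == 1 else '(?:' + '|'.join(alts) + ')'
    if end:                               # the remainder is optional
        body = body + '?' if len(body) == 1 else '(?:' + body + ')?'
    return body

urls = ["example.com/a", "example.com/b", "example.net/x"]
pattern = trie_to_regex(build_trie(urls))
# shared prefix "example." appears once instead of three times
assert re.fullmatch(pattern, "example.com/b")
assert not re.fullmatch(pattern, "example.org/z")
```

Note that for exact hostname bans, a dstdomain ACL loaded from a file is usually cheaper still than any url_regex, since Squid matches domains with a tree lookup rather than a pattern scan.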