[squid-users] Rate limiting bad clients?
Amos Jeffries
squid3 at treenet.co.nz
Tue Aug 9 16:23:34 UTC 2016
On 9/08/2016 5:39 p.m., Dan Charlesworth wrote:
> Hi all,
>
> This is more of a squid-adjacent query. Hopefully relevant enough for someone here to help…
>
> I’m sick of all these web apps that take it upon themselves to hammer proxies when they don’t get the response they want, for example when they’re asked to authenticate. On big networks, behind a forward proxy, there are always a few computers with some software making dozens of identical failing requests per second.
>
> - What’s a good approach for rate limiting the client computers that are doing this?
> - Can anyone point to a good tutorial for this using, say, iptables if that’s appropriate?
>
> Any advice welcome.
HTTP being stateless, Squid does not track enough correlation between
requests to do anything like that natively.
I've been suggesting that people add an external ACL helper that tracks
requests per client IP and tells Squid whether to accept or reject any
given request based on that client's history.
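As a rough illustration of the idea, here is a minimal helper sketch in Python (the bundled helpers are Perl, but the protocol is the same: Squid writes one lookup key per line and the helper answers OK or ERR). The threshold, window size, and the assumption that Squid passes %SRC as the key are all illustrative choices, not anything Squid ships with:

```python
#!/usr/bin/env python3
"""Hypothetical external ACL helper: permit at most MAX_REQUESTS per
client IP within a sliding WINDOW of seconds. Squid sends the lookup
key (here, the client IP via %SRC) on stdin; the helper replies
"OK" (match) or "ERR" (no match) on stdout, one line per request."""
import sys
import time
from collections import defaultdict, deque

MAX_REQUESTS = 20   # illustrative threshold; tune for your network
WINDOW = 1.0        # sliding window, in seconds

history = defaultdict(deque)  # client IP -> recent request timestamps

def check(ip, now=None):
    """Return "ERR" if this IP exceeded the rate limit, else "OK"."""
    now = time.monotonic() if now is None else now
    q = history[ip]
    # Drop timestamps that have aged out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return "ERR"
    q.append(now)
    return "OK"

if __name__ == "__main__":
    for line in sys.stdin:
        ip = line.strip()
        # Reply must be flushed immediately; Squid blocks waiting
        # for one answer per request line.
        sys.stdout.write(check(ip) + "\n")
        sys.stdout.flush()
```

With "ERR" mapped to a deny rule in squid.conf, a client that bursts past the threshold gets refused until its request history falls back under the window.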
Recent Squid releases bundle ext_delayer_acl, a Perl script that
can be adapted for this type of thing.
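Wiring any such helper into Squid goes through the external_acl_type directive; a sketch (the helper path and ACL names here are placeholders, and the short ttl= is chosen so cached verdicts don't mask rate changes):

```
# Hypothetical squid.conf fragment: pass the client IP (%SRC) to a
# rate-tracking helper and deny requests it rejects.
external_acl_type rate_check ttl=1 negative_ttl=1 %SRC /usr/local/bin/rate_check.py
acl toofast external rate_check
http_access deny toofast
```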
Amos