[squid-users] block user agent

Amos Jeffries squid3 at treenet.co.nz
Thu Nov 16 09:33:34 UTC 2017


On 16/11/17 21:29, Vieri wrote:
> 
> ________________________________
> From: Amos Jeffries <squid3 at treenet.co.nz>
>>
>>> The following works:
>>>
>>> acl denied_useragent browser Chrome
>>> acl denied_useragent browser MSIE
>>> acl denied_useragent browser Opera
>>> acl denied_useragent browser Trident
>>> [...]
>>> http_access deny denied_useragent
>>> http_reply_access deny denied_useragent
>>> deny_info http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=denied_useragent denied_useragent
>>>
>>> The following works for HTTP sites, but not for HTTPS sites in an ssl-bumped setup:
>>>
>>> acl allowed_useragent browser Firefox/
>>> [...]
>>> http_access deny !allowed_useragent
>>> deny_info http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=allowed_useragent allowed_useragent
>>>
>> The User-Agent, along with all other HTTP-layer details, is hidden
>> behind the encryption layer in HTTPS. To do anything with them you must
>> decrypt the traffic first. Once decrypted it becomes regular HTTP
>> traffic - the normal access controls should then work as-is.
> 
> 
> So why does my first example actually work even for https sites?

If you are decrypting the traffic, then it works exactly as I said: the 
same as for HTTP messages.
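
For reference, a minimal SSL-Bump sketch that decrypts the traffic so 
HTTP-layer ACLs like "browser" can see the User-Agent might look like 
the following (the certificate path and port are illustrative, not from 
your config):

# illustrative only: adjust cert path and port to your setup
http_port 3128 ssl-bump cert=/etc/squid/myCA.pem generate-host-certificates=on
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

With a setup along those lines the decrypted requests pass through 
http_access normally, so the denied_useragent/allowed_useragent ACLs 
apply to HTTPS traffic too.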

If you are not decrypting the traffic, but are receiving forward-proxy 
traffic, then you are probably blocking the CONNECT messages that set up 
the tunnels for HTTPS - a CONNECT has a User-Agent header *if* it was 
generated by a UA instead of an intermediary like Squid.
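
For example, a browser-issued CONNECT looks roughly like this (hostname 
and version are illustrative):

CONNECT example.com:443 HTTP/1.1
Host: example.com:443
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:57.0) Gecko/20100101 Firefox/57.0
Proxy-Connection: keep-alive

That header is sent in cleartext to the proxy, which is why a "browser" 
ACL can still match on it even without any decryption.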

> 
> acl denied_useragent browser Chrome
> acl denied_useragent browser MSIE
> acl denied_useragent browser Opera
> acl denied_useragent browser Trident
> [...]
> http_access deny denied_useragent
> http_reply_access deny denied_useragent
> deny_info http://proxy-server1/proxy-error/?a=%a&B=%B&e=%e&E=%E&H=%H&i=%i&M=%M&o=%o&R=%R&T=%T&U=%U&u=%u&w=%w&x=%x&acl=denied_useragent denied_useragent
> 
> If the above "works" then another way would be to use a negated regular expression such as:
> acl denied_useragent browser (?!Firefox)
> but I don't think it's allowed.

AFAIK that feature (negative lookahead) belongs to a different regex 
grammar than the POSIX one Squid uses, so it is not available there.
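
The portable way to express "anything except Firefox" in Squid is to 
match positively and negate the ACL in the rule itself, as your second 
example already does:

acl allowed_useragent browser Firefox/
http_access deny !allowed_useragent

The negation lives in http_access rather than in the regex, so no 
lookahead support is needed.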

PS. You do know that the UA strings of modern browsers all reference 
each other, right?  "Chrome like-Gecko like Firefox" etc.
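
For example (version numbers illustrative), typical strings look like:

Firefox: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:57.0) Gecko/20100101 Firefox/57.0
Chrome:  Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.94 Safari/537.36

Note the Chrome string contains both "like Gecko" and "Safari", so 
unanchored substring/regex matches on those tokens will catch browsers 
you did not intend. Matching on "Firefox/" (with the slash) is safer 
than matching on "Gecko" alone.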

Amos

