[squid-users] How to have squid as safe as (e.g.) firefox?

Amos Jeffries squid3 at treenet.co.nz
Thu Aug 13 06:06:10 UTC 2015

On 13/08/2015 9:20 a.m., Jeremie Rafin wrote:
> Hello,
> If I browse on the internet **without** a proxy like squid, and if I
use a browser like firefox, the certificate management of SSL
connections looks, as far as I feel it, safe and secure.


Technically a clean install of Squid with default options is more secure
than any browser you will be able to find.

That is because it comes configured as a forward proxy, which does not
touch the TLS traffic in any way but simply relays CONNECT tunnels.
Inside the tunnel the browser<->server security is in full control,
which makes the Squid relay exactly as secure as whatever the browser
and server would agree to without Squid.
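For illustration, an HTTPS request through a forward proxy is just a
CONNECT tunnel; the proxy sees the TLS bytes only as opaque payload.
A sketch of what the browser sends (example.com is a placeholder):

```
CONNECT example.com:443 HTTP/1.1
Host: example.com:443

(opaque TLS handshake and record bytes, relayed verbatim)
```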

Additionally, Squid enforces that HTTPS tunnels only go to port 443,
which is something the browsers do not do. That makes Squid better in
this one way, on top of all the things the browsers do inside their
tunnel.
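The stock squid.conf ships with ACLs to that effect; a trimmed sketch
of the relevant default lines:

```
acl SSL_ports port 443
acl CONNECT method CONNECT
# Deny CONNECT tunnels to anything other than the TLS port
http_access deny CONNECT !SSL_ports
```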


That said, the feeling of security you have with a browser is a lie.
Pure "security theatre", done so well that you and billions of others
can't even see it.

What you are doing is trusting a very large set of nearly a thousand CA
entities. Including most of those governments with bad reputations now
for surveillance, and a lot of corporations with agendas of their own.
For all sorts of reasons which you have no knowledge or control over.
Yes, someone has vetted that their published intentions were good to get
them into the list, but it was not you. For you and everyone else it is
almost blind trust.

At any time *one*, just one, of them could sign a faked certificate.
When that happens no user will be able to tell the difference without
detailed digging down into the browser cert information.

The only reason these things come to light at all is when the ability is
abused in obvious user-visible ways for dictatorial censorship or
malware attacks, or when the few vigilant and knowledgeable people
actively seeking it out catch it in the act. It's not the CA action that
was seen first, but the abuse of power it allowed.

Thankfully the repercussions of being wiped from the global CA list are
severe enough to prevent power abuse in most cases. But there have been
some exceptions even so.

So: security theatre. It's been working so far, but only just.

> One of my favorite web pages to test this is: https://revoked.grc.com/
> Going to this site must generate an error such as a "revoked
> certificate" reject.
> But, if I browse with squid "behind", configured with SSL bumping and
> host certificate generation (in such a way my proxy works well for
> https), this site (https://revoked.grc.com/) is **not** filtered. Which
> is, to my eye, a big security hole...

For anything other than blind relay to have happened in Squid with
regards to HTTPS you have to configure it to happen. We try to make the
defaults secure in a reasonable way that works for general use. Just
like the browser people do.

You have configured certificate generation. By definition that means
that a *newly generated* certificate is being used to the browser. Not
the one that was revoked.

You have also configured "sslproxy_cert_error deny all" which forces
Squid to accept and ignore all possible origin server certificate
errors. Including revocation.

I hope you can see/understand the result.

> Questions (I am searching for answers for several months, without success):
> -while using squid, is it possible to have a SSL/HTTPS level of
> security at least as high as with a reference like firefox (assuming
> this is a reference; in my humble opinion, regarding certificate
> management, it is, as I don't know better)?


> -do you know any implementation of NSS library (the security library
> of firefox, probably safer than openssl) for certificate checking helper
> (cf. sslcrtvalidator_program)?

No. Just the OpenSSL one we provide so far.

I'm still working on getting library-agnostic TLS support rolled into
Squid. But the main effort so far has been on the squid binary, not the
helpers.

> -how to manage certificate lists, especially automatic updates of them
> (e.g. use of OCSP inside squid helpers)? Could we access tweaks like
> maximum acceptable age of certificate validity, white and black lists of
> trust authorities, exclusion of self-signed certificates, etc?

The OCSP specification and library documentation are the best places to
look for that kind of thing.
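For hands-on experimenting, the OpenSSL command line can query an OCSP
responder about a certificate directly (the file names and the responder
URL below are placeholders; the real URL normally comes from the
certificate's AIA extension):

```
# issuer.pem is the CA certificate that signed server.pem
openssl ocsp -issuer issuer.pem -cert server.pem \
    -url http://ocsp.example-ca.com -CAfile issuer.pem -text
```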

How the helper determines valid/invalid is intentionally outside the
squid operating scope.

Within squid itself we use the library API lookup which hides/abstracts
that detail.

> Thanks for any help, any suggestion! Jérémie
> PS1: some test web pages for which, to my mind, security fails with squid:
> -https://revoked.grc.com (my "favorite"; must fail browsing)
> -https://www.ssllabs.com/ssltest/viewMyClient.html (to get a big picture, especially if OCSP stapling is active)
> -https://www.howsmyssl.com/ (not as good as previous; provides another point of view)
> PS2: my squid 3.5 works on a debian wheezy 7.6; here is my squid.conf
> (only my additions on top of the default file content); so far I try to
> have a transparent (implicit) proxy, but an explicit proxy is not better
> (only a simpler configuration):
> # SSL bumping configuration
> http_port 3126 intercept
> https_port 3127 intercept ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid/ssl_cert/myCA.pem
> sslcrtd_program /usr/local/squid-3.5/lib/squid/ssl_crtd -s /var/spool/squid3_ssldb -M 4MB
> # SSL Options - to mimic firefox; some of the keys are weak but some of my favorite websites need them :(
> sslproxy_options NO_SSLv2,No_Compression

Careful. Squid will do what you tell it to.

In order for this to be more secure than the browser, you will have to
ensure that each of the things you are allowing actually is more secure
than what the browser does, and that you are not allowing anything the
browser decides is bad.

> sslproxy_cert_error deny all
> # Splice access lists
> acl splice_client src
> acl splice_domain dstdomain .paypal.com
> acl splice_dst dst
> # HTTPS access

Nope, "TLS access" is a better description.

HTTPS is two protocol layers: an HTTP layer over a TLS layer (just as
"TCP/IP" is actually TCP over an IP layer).

The ssl_bump directive controls only the TLS layer actions. The
http_access rules later deal with the decrypted HTTP layer - but only if
it was allowed to be decrypted (the "bump" action) by these rules.

> ssl_bump splice splice_client

Splice is equivalent to blind tunnelling, but can be done after Squid
has played with the certificates a bit (using read-only accesses).

Since splice_client is based only on src-IP and nothing TLS-layer
related, it is better to use the "none" action instead of splice in the
above rule. A truly secure blind tunnel will then be used.

> ssl_bump splice splice_domain

This is a good example of where dstdomain fails. You are deciding
whether to splice instead of interpreting the HTTP message, based on
details inside that very message - which has not yet been interpreted.

Make sure you are using the latest 3.5 release, and use
"ssl::server_name" instead of dstdomain in the ACL definition.
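For example, the ACL from the quoted config could become (reusing the
same .paypal.com value):

```
# Matches the SNI from the TLS ClientHello (or the server certificate
# name at later bump steps) instead of relying on dstdomain
acl splice_domain ssl::server_name .paypal.com
```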

> ssl_bump splice splice_dst

> ssl_bump server-first all

DO NOT mix the old and new config styles together. server-first requires
doing things like emitting a fake server cert to the client before
reading some of the client handshake details that splicing needs. But
you have already spliced a bunch of transactions with the earlier rules.
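A sketch of a consistent new-style (peek-and-splice) ruleset for Squid
3.5, under the assumption that you want to splice the listed exceptions
and bump everything else (the ACL name reuses the quoted config):

```
# Step 1: peek at the TLS ClientHello so SNI-based ACLs can work
acl step1 at_step SslBump1
ssl_bump peek step1

# Splice (blind-tunnel) the exceptions, bump the rest
acl splice_domain ssl::server_name .paypal.com
ssl_bump splice splice_domain
ssl_bump bump all
```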

> # Hide PROXY
> via off
> forwarded_for delete

Does *not* hide the proxy.

It hides the *client*, whose details would otherwise be actively
"shouted" out to the world in the protocol places where they normally
go.
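Concretely, with the defaults Squid adds request headers like these
(host name and address are placeholders):

```
Via: 1.1 proxy.example.net (squid/3.5.x)
X-Forwarded-For: 192.0.2.10
```

"via off" suppresses the Via header and "forwarded_for delete" strips
the client IP from X-Forwarded-For - so what these settings conceal is
the client's address, not the proxy's existence.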
