[squid-users] Cert authority invalid failures.

Amos Jeffries squid3 at treenet.co.nz
Thu Apr 21 17:29:47 UTC 2016


On 22/04/2016 2:36 a.m., Markey, Bruce wrote:
> acl internal src 192.168.200.0/21
> acl wireless src 192.168.100.0/23
> 
> acl Safe_ports port 80
> acl Safe_ports port 443
> acl SSL_ports port 443
> acl CONNECT method CONNECT
> 
> acl allowed dstdomain -i "/etc/squid3/acls/http_allowed.acl"
> acl prime dstdomain -i "/etc/squid3/acls/squid-prime.acl"
> acl ips dst -n "/etc/squid3/acls/broken_ips.acl"
> acl blocked dstdomain -i "/etc/squid3/acls/http_blocked.acl"
> 
> http_access allow allowed
> http_access allow ips
> http_access deny blocked
> http_access deny prime

It would seem that only these two ACLs, blocked or prime, could be
causing your denial.

Note that when dstdomain is passed a raw-IP, the reverse-DNS is looked
up; if either that domain OR the raw-IP matches an entry in the list, it
counts as a match.
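
For example (the domain and address here are made up): if
http_blocked.acl contains ".example.net" and the PTR record for
203.0.113.10 resolves to host.example.net, a request for the raw-IP is
denied even though it never named the domain:

  acl blocked dstdomain -i "/etc/squid3/acls/http_blocked.acl"
  # a CONNECT to 203.0.113.10 triggers a reverse-DNS lookup; the
  # resulting host.example.net matches .example.net, so this denies it
  http_access deny blocked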

Note2: when a connection arrives at your "intercept ssl-bump" port,
Squid generates a CONNECT message using the raw-IP of that connection
and passes it through your http_access controls. The denial could
happen then.

Once the SNI is found by the peek, another pass of http_access happens;
denial could happen then as well. Only after that does your splice kick in.
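
To make that concrete, the passes for an intercepted TLS connection
under the splice config above look like this (a sketch of the sequence,
not extra directives to add):

  # pass 1: on the fake CONNECT built from the intercepted connection,
  #         so dst/dstdomain ACLs see only the raw-IP (plus its rDNS name)
  # pass 2: after the SslBump2 peek has found the TLS SNI,
  #         so dstdomain ACLs can now match the real server name
  ssl_bump peek !broken_sites
  ssl_bump splice all    # happens only if both passes were allowed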

> 
> http_access allow internal
> http_access allow wireless

Move these ...

> http_access deny !Safe_ports
> http_access deny CONNECT !SSL_ports

... default security controls back up to the top of your config.
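
With those two lines moved, the http_access section of the config above
would read (same rules, new order):

  http_access deny !Safe_ports
  http_access deny CONNECT !SSL_ports
  http_access allow allowed
  http_access allow ips
  http_access deny blocked
  http_access deny prime
  http_access allow internal
  http_access allow wireless
  http_access deny all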

> http_access deny all
> 
> acl step1 at_step SslBump1
> acl step2 at_step SslBump2
> acl step3 at_step SslBump3
> 
> acl broken_sites ssl::server_name_regex "/etc/squid3/acls/http_broken.txt"
> ssl_bump peek !broken_sites
> ssl_bump splice all
> 
> sslproxy_capath /etc/ssl/certs
> 
> sslcrtd_program /lib/squid3/ssl_crtd -s /etc/squid3/ssl_db -M 4MB
> sslcrtd_children 32 startup=5 idle=1
> 
> 
> 
> http_port 3128 intercept
> https_port 3129 intercept ssl-bump cert=/etc/squid3/certs/squid.pem cafile=/etc/squid3/certs/squid.pem key=/etc/squid3/certs/squid.pem  generate-host-certificates=on dynamic_cert_mem_cache_size=4MB sslflags=NO_SESSION_REUSE
> 
> dns_nameservers 192.168.201.1 8.8.8.8
> 
> wccp_version 2
> wccp2_router 192.168.200.73
> wccp2_forwarding_method gre
> wccp2_return_method gre
> wccp2_service standard 0 password=xxxx
> wccp2_service dynamic 70 password=xxxx
> wccp2_service_info 70 protocol=tcp flags=dst_ip_hash priority=240 ports=443
> 
> I did update the ca bundle if that helps. 
> 
> 
> 
> Bruce Markey | Network Security Analyst
> STEINMAN COMMUNICATIONS
> 717.291.8758 (o) | bmarkey at steinmancommunications.com
> 8 West King St | PO Box 1328, Lancaster, PA 17608-1328
> 
> -----Original Message-----
> From: squid-users [mailto:squid-users-bounces at lists.squid-cache.org] On Behalf Of Amos Jeffries
> Sent: Thursday, April 21, 2016 8:59 AM
> To: squid-users at lists.squid-cache.org
> Subject: Re: [squid-users] Cert authority invalid failures.
> 
> On 21/04/2016 8:18 a.m., Markey, Bruce wrote:
>> I'm curious as to why this is happening.
>>
>> Proxy was implemented last week and since then I've been dealing with all the sites that don't work. Not a problem, knew it was going to happen. I'd like to understand why the following is happening.
>>
>>
>> 1.       User goes to https://www.whatever.com
>>
>> 2.       Browser, mostly Chrome, gives the following error:  Connection not private. NET::ERR_CERT_AUTHORITY_INVALID
>>
> 
> Typing that into a search engine produces a thread explaining that it is the browser message shown when HSTS is in effect on a website and the server cert is not trusted by the browser.
> 
> 
> 
>> 3.       If you view the cert it shows the dynamic cert listed.
>>
>> 4.       Click the "Proceed to www.whatever.com (unsafe)" link.
>>
>> 5.       Now I get a Squid error:  Requested URL could not be retrieved.  Access denied while trying to retrieve https:// some ip address/*
>>
> 
> And that #5 explains why: it was actually not the web server producing the cert, but Squid doing SSL-bumping in order to show you the error page.
> 
> 
> 
>> Thing is, I don't have an ACL blocking that IP.  (Small sub-question here: is there a way to tell which ACL blocks something?)
>>
> 
> Something clearly is. But not what you expect, or you would not be here asking about it.
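> 
> On the sub-question: one way to see which ACL is matching (a debugging
> aid, not something to leave on in production) is to raise the
> access-control debug level; cache.log will then record each ACL test
> and the http_access line that finally matched:
> 
>   # section 28 is Squid's access-control code; level 3 logs each check
>   debug_options ALL,1 28,3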
> 
>> What I've had to do to get around this is add www.whatever.com to my broken_sites.acl.  Then add the IP to an allowed_ips.acl.
>>
>> Then I http_access allow the ips list
>>
>> And skip peeking at the broken site.
>>
>> acl broken_sites ssl::server_name_regex "/etc/squid3/acls/http_broken.txt"
>> ssl_bump peek !broken_sites
>> ssl_bump splice all
>>
>> I'm trying to understand why this is breaking and if I'm doing the right thing in fixing it.
>>
> 
> Please provide your whole squid.conf (except empty or # comment lines).
> We might need to see it all to find what the problem is.
> 
> 
>>
>> The second error I'm getting is:
>>
>>
>> The following error was encountered while trying to retrieve the URL:
>> https://*.agentimediaservices.com/*
>>
>> Failed to establish a secure connection to 63.240.52.151
>>
>> The system returned:
>>
>> (71) Protocol error (TLS code: X509_V_ERR_UNABLE_TO_GET_ISSUER_CERT_LOCALLY)
>>
>> SSL Certficate error: certificate issuer (CA) not known:
>> /C=GB/ST=Greater Manchester/L=Salford/O=COMODO CA Limited/CN=COMODO RSA Organization Validation Secure Server CA
>>
>> Same question.  From what I've read this means that I don't have the
>> correct root CA?  Is that correct?  If so, is the fix to go find the
>> correct .crt and add it to the standard ca-cert store?  (I'm on Debian,
>> so /usr/share/ca-certificates/Mozilla)
>>
>> Again, is this correct as to what is going wrong and the correct fix?
> 
> Well, the first step is to ensure your ca-certificates package is up to date. That usually solves these.
> 
> But not always, especially if the CA has been caught doing bad things and suddenly dropped, or if it has begun issuing certs to clients before being accepted by the Mozilla CA list people.
> 
> It could also be a problem with an intermediate cert simply being omitted by the server. In that case you will need to add it to your server-wide cert store or configure Squid to load it.
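> 
> If the server really is omitting an intermediate, and your Squid is
> new enough to support it (check your version's squid.conf
> documentation; the file path here is hypothetical), the missing certs
> can be handed to Squid directly:
> 
>   # PEM bundle of intermediate CA certs used to complete servers'
>   # incomplete chains during verification
>   sslproxy_foreign_intermediate_certs /etc/squid3/certs/intermediates.pem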
> 
> Amos
> 
> 


