[squid-users] Ssl-bump deep dive (testing)
James Lay
jlay at slave-tothe-box.net
Fri May 29 00:54:08 UTC 2015
So I took the advice of those here and am getting explicit proxying working first; here's my first attempt. My test environment is Ubuntu 15.04 Server as the Squid host, running VirtualBox with Kali Linux as the client. Here's my Squid 3.5.4 configure line:
./configure --prefix=/opt --enable-icap-client --with-openssl --enable-ssl --enable-ssl-crtd --enable-linux-netfilter --enable-follow-x-forwarded-for --with-large-files --sysconfdir=/opt/etc/squid --enable-external-acl-helpers=none
Full squid.conf:
#####################################
acl localnet src 192.168.1.0/24
acl SSL_ports port 443
acl Safe_ports port 80
acl Safe_ports port 443
acl CONNECT method CONNECT
http_access allow all
sslproxy_cert_error allow all
sslproxy_cert_error deny all
sslproxy_capath /etc/ssl/certs
sslproxy_flags DONT_VERIFY_PEER
sslproxy_options ALL
sslcrtd_program /opt/libexec/ssl_crtd -s /opt/var/ssl_db -M 4MB
sslcrtd_children 5
http_port 3129 ssl-bump cert=/opt/etc/squid/certs/sslsplit_ca_cert.pem cafile=/opt/etc/squid/certs/sslsplit_ca_cert.pem key=/opt/etc/squid/certs/sslsplit_ca_key.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB sslflags=NO_SESSION_REUSE
external_acl_type sni ttl=30 concurrency=10 children-max=20 children-startup=5 %ssl::>sni /opt/etc/squid/bumphelper.py
acl sni_exclusions external sni
acl tcp_level at_step SslBump1
acl client_hello_peeked at_step SslBump2
ssl_bump peek tcp_level all
ssl_bump splice client_hello_peeked sni_exclusions
ssl_bump bump all
logformat mine %>a %[ui %[un [%tl] "%rm %ru HTTP/%rv" %>Hs %<st %Ss:%Sh %ssl::bump_mode %ssl::>sni %ssl::>cert_subject
access_log syslog:daemon.info mine
refresh_pattern -i (cgi-bin|\?) 0 0% 0
refresh_pattern . 0 20% 4320
coredump_dir /opt/var
#####################################
bumphelper.py:
#####################################
#!/usr/bin/python
import sys

while True:
    req = sys.stdin.readline()
    if not req:
        break
    id, sni = req.split()
    sys.stderr.write('request %r\n' % req)
    sys.stderr.flush()
    if sni == 'google.com':  # bypass
        sys.stdout.write('{} OK\n'.format(id))
        sys.stdout.flush()
    else:
        sys.stdout.write('{} ERR\n'.format(id))
        sys.stdout.flush()
#####################################
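To sanity-check the helper outside of Squid, something like the sketch below can be used; it feeds the helper channel-id/SNI pairs the way external_acl_type does with concurrency enabled and echoes the verdicts back (the helper path and SNI values are just examples):
#####################################
#!/usr/bin/python
# Rough standalone test for bumphelper.py (sketch only; path and SNI values
# are examples). Each request line is "<channel-id> <sni>", matching Squid's
# concurrent external_acl_type protocol; the helper answers with
# "<channel-id> OK" or "<channel-id> ERR".
import subprocess
import sys

helper = subprocess.Popen(['/opt/etc/squid/bumphelper.py'],
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE)

for channel, sni in ((0, 'google.com'), (1, 'www.google.com'), (2, 'www.msn.com')):
    helper.stdin.write('%d %s\n' % (channel, sni))
    helper.stdin.flush()
    # note the exact match in the helper: only 'google.com' comes back OK
    sys.stdout.write(helper.stdout.readline())

helper.stdin.close()
helper.wait()
#####################################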
The tests:
root at kali:~/test# wget -d https://www.google.com
######################################
DEBUG output created by Wget 1.13.4 on linux-gnu.
URI encoding = `UTF-8'
URI encoding = `UTF-8'
--2015-05-28 17:44:31-- https://www.google.com/
Connecting to 192.168.1.6:3129... connected.
Created socket 4.
Releasing 0x092c6730 (new refcount 0).
Deleting unused 0x092c6730.
---request begin---
CONNECT www.google.com:443 HTTP/1.1
User-Agent: Wget/1.13.4 (linux-gnu)
---request end---
proxy responded with: [HTTP/1.1 200 Connection established
]
---request begin---
GET / HTTP/1.1
User-Agent: Wget/1.13.4 (linux-gnu)
Accept: */*
Host: www.google.com
Connection: Close
Proxy-Connection: Keep-Alive
---request end---
Proxy request sent, awaiting response...
---response begin---
HTTP/1.1 503 Service Unavailable
Server: squid/3.5.4
Mime-Version: 1.0
Date: Thu, 28 May 2015 23:44:33 GMT
Content-Type: text/html;charset=utf-8
Content-Length: 3899
X-Squid-Error: ERR_SECURE_CONNECT_FAIL 32
Vary: Accept-Language
Content-Language: en
X-Cache: MISS from analysis
Via: 1.1 analysis (squid/3.5.4)
Connection: close
---response end---
503 Service Unavailable
URI content encoding = `utf-8'
2015-05-28 17:44:32 ERROR 503: Service Unavailable.
########################################
access.log entry for the above wget:
#####################################
May 28 17:44:33 analysis squid: 192.168.1.91 - - [28/May/2015:17:44:33
-0600] "CONNECT www.google.com:443 HTTP/1.1" 200 0 TAG_NONE:HIER_DIRECT
peek www.google.com -
May 28 17:44:33 analysis squid: 192.168.1.91 - - [28/May/2015:17:44:33
-0600] "GET https://www.google.com/ HTTP/1.1" 503 4242
TAG_NONE:HIER_NONE - www.google.com -
#####################################
sudo /opt/sbin/squid -d 1 -N -f /opt/etc/squid/squid.conf
######################################
2015/05/28 17:44:33| Error negotiating SSL on FD 14:
error:00000000:lib(0):func(0):reason(0) (5/-1/32)
######################################
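One sanity check here would be to try a plain client handshake (with SNI) to the origin directly from the Squid box, to see whether the failure is specific to Squid's server-side negotiation. A rough sketch, with hostnames just taken from these tests:
######################################
#!/usr/bin/python
# Sketch only: attempt a direct TLS handshake with SNI from the Squid box.
# Needs Python 2.7.9+ (or 3.4+) for ssl.create_default_context().
import socket
import ssl
import sys

for host in ('www.google.com', 'www.msn.com'):
    try:
        ctx = ssl.create_default_context()
        conn = ctx.wrap_socket(socket.create_connection((host, 443), timeout=5),
                               server_hostname=host)
        cipher, proto, bits = conn.cipher()
        sys.stdout.write('%s: handshake OK (%s, %s)\n' % (host, proto, cipher))
        conn.close()
    except Exception as err:
        sys.stdout.write('%s: handshake FAILED: %s\n' % (host, err))
######################################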
I see the same sort of failure for apple.com and yahoo.com. I'm assuming this is HSTS, but I could be wrong. MSN, however, works fine with the above:
root at kali:~/test# wget -d https://www.msn.com
######################################
DEBUG output created by Wget 1.13.4 on linux-gnu.
URI encoding = `UTF-8'
URI encoding = `UTF-8'
--2015-05-28 18:24:50-- https://www.msn.com/
Connecting to 192.168.1.6:3129... connected.
Created socket 4.
Releasing 0x0a6493c0 (new refcount 0).
Deleting unused 0x0a6493c0.
---request begin---
CONNECT www.msn.com:443 HTTP/1.1
User-Agent: Wget/1.13.4 (linux-gnu)
---request end---
proxy responded with: [HTTP/1.1 200 Connection established
]
---request begin---
GET / HTTP/1.1
User-Agent: Wget/1.13.4 (linux-gnu)
Accept: */*
Host: www.msn.com
Connection: Close
Proxy-Connection: Keep-Alive
---request end---
Proxy request sent, awaiting response...
---response begin---
HTTP/1.1 200 OK
######################################
May 28 18:24:51 analysis squid: 192.168.1.91 - - [28/May/2015:18:24:51
-0600] "CONNECT www.msn.com:443 HTTP/1.1" 200 0 TAG_NONE:HIER_DIRECT
peek www.msn.com -
May 28 18:24:52 analysis squid: 192.168.1.91 - - [28/May/2015:18:24:52
-0600] "GET https://www.msn.com/ HTTP/1.1" 200 38613
TCP_MISS:HIER_DIRECT bump www.msn.com -
######################################
I found that adding %ssl::bump_mode to the log format really helped show where I was in the bump steps. I also tried the new ssl::server_name ACL instead of the external helper, but I got the same results with google, yahoo, and apple. Even setting ssl_bump splice all didn't work well... it appears that yahoo, google, and apple are peek-resistant.
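For reference, the ssl::server_name variant I mean is roughly this (ACL name and domain list are illustrative, not my exact config):
#####################################
acl nobump_sni ssl::server_name .google.com .yahoo.com .apple.com
acl tcp_level at_step SslBump1
acl client_hello_peeked at_step SslBump2
ssl_bump peek tcp_level all
ssl_bump splice client_hello_peeked nobump_sni
ssl_bump bump all
#####################################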
I'll keep digging. Thank you.
James