[squid-users] Squid returns 400 to GET / HTTP/1.1 with Host Header

Stephen Nelson-Smith sanelson at gmail.com
Mon Apr 23 15:15:13 UTC 2018


Hello,

I need to demonstrate and test a Squid setup which blacklists by
default and allows requests only to whitelisted URLs from known
networks. It's currently running in my staging environment and working
as expected, but I want to be able to test and demo it on demand, with
nicer feedback than curl gives.
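
For context, the policy in squid.conf is along these lines (a minimal
sketch; the ACL names, network and whitelist file are placeholders
rather than my real config):

acl known_networks src 10.0.0.0/8
acl whitelist dstdomain "/etc/squid/whitelist.txt"

# allow whitelisted destinations from known networks, deny the rest
http_access allow known_networks whitelist
http_access deny all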

I've deployed Redbot (https://github.com/mnot/redbot), which I've set
up to send all of its HTTP requests via the Squid proxy.

Using curl -x from the Redbot machine, all my tests pass, but through
the application Squid returns a 400 no matter what. Every request
reaches Squid and I see it in the logs, but instead of allowing the
request or returning a 403 with X-Squid-Error: ERR_ACCESS_DENIED 0,
Squid answers every one with a 400 and X-Squid-Error: ERR_INVALID_URL 0.
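
For the curl case, the passing tests are invocations along these lines
(the proxy host and port here are placeholders for my real ones):

curl -v -x http://proxy.example.com:3128 http://chess.com/

A whitelisted URL comes back with the origin's response, and anything
else gets the expected 403 with ERR_ACCESS_DENIED.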

Digging into it with logs and tcpdump, the key difference I see is
that Redbot sends a request of the form:

GET / HTTP/1.1
Host: chess.com

Curl sends:

GET http://chess.com/ HTTP/1.1
Host: chess.com
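
Both forms are easy to replay by hand against the proxy, for example
with netcat (proxy host and port are placeholders again; Connection:
close just makes nc exit once the response is complete):

# origin-form, as Redbot sends it
printf 'GET / HTTP/1.1\r\nHost: chess.com\r\nConnection: close\r\n\r\n' | nc proxy.example.com 3128

# absolute-form, as curl sends it
printf 'GET http://chess.com/ HTTP/1.1\r\nHost: chess.com\r\nConnection: close\r\n\r\n' | nc proxy.example.com 3128

Replayed like this, the first form should reproduce the 400 and the
second the normal behaviour from the curl tests.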

From the RFC it seems like Redbot's request is perfectly valid, so I
feel like Squid should do the right thing, deduce what Redbot wants
from the Host header, and run the request through its ACLs. Instead,
it just errors with:

HTTP/1.1 400 Bad Request
Server: squid/3.5.27
Mime-Version: 1.0
Date: Mon, 23 Apr 2018 11:50:23 GMT
Content-Type: text/html;charset=utf-8
Content-Length: 3465
X-Squid-Error: ERR_INVALID_URL 0
X-Cache: MISS from proxy.redaction.com
Via: 1.1 proxy.redaction.com (squid/3.5.27)

Does this look like a Squid config issue? Or do I need to make Redbot
send requests the way curl does?

