[squid-dev] Use MAX_URL for first line limitation
Eduard Bagdasaryan
eduard.bagdasaryan at measurement-factory.com
Thu Jun 7 22:13:38 UTC 2018
Hi Squid developers,
There is a bug in Squid when using the %>ru logformat code for
relatively large URLs, with sizes between MAX_URL=8K and 64K. In this
case, a dash ("-") is logged, whereas the similar %ru code logs the
URL (or its prefix), as expected. We found a solution for this bug:
introduce a new AccessLogEntry virgin URL field and use it inside
Format::assemble() for LFT_CLIENT_REQ_URI (when the parsed request is
nil). However, on its own this solution is insufficient: with it,
%>ru logs large and small URLs differently. For example, Squid strips
whitespace from small URLs while keeping it in large ones.
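To make the fallback idea above concrete, here is a minimal,
self-contained sketch. It is not actual Squid code: LogEntry,
clientReqUriForLogging(), and the field names are hypothetical. The
point is simply that, when the parsed request is nil, the formatter
would fall back to the stored virgin URL instead of logging a dash:

    // Illustrative sketch only; names are hypothetical, not Squid's real API.
    #include <iostream>
    #include <optional>
    #include <string>

    struct LogEntry {
        std::optional<std::string> parsedRequestUri; // set only when urlParse() succeeded
        std::string virginUrl;                       // raw URL saved before parsing
    };

    // Mirrors the intent of the LFT_CLIENT_REQ_URI change: prefer the parsed
    // URI, otherwise fall back to the virgin copy instead of printing "-".
    static std::string clientReqUriForLogging(const LogEntry &al)
    {
        if (al.parsedRequestUri)
            return *al.parsedRequestUri;
        if (!al.virginUrl.empty())
            return al.virginUrl; // large URLs land here because urlParse() gave up
        return "-";
    }

    int main()
    {
        const LogEntry small{std::string("http://example.com/x"), "http://example.com/x"};
        const LogEntry large{std::nullopt, std::string(9000, 'a')}; // > MAX_URL; parse failed
        std::cout << clientReqUriForLogging(small) << '\n';         // full small URL
        std::cout << clientReqUriForLogging(large).size() << '\n';  // 9000
        return 0;
    }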
We considered two possible solutions:
1. Adjust urlParse() to process large URLs (instead of bailing out when
it notices that a URL exceeds MAX_URL). This solution can be
problematic because the parsed URL may even exceed the 64K limit (e.g.,
after applying rfc1738_unescape()).
2. Adjust Http::One::RequestParser::parseRequestFirstLine() to immediately
reject requests whose URLs exceed MAX_URL (i.e., use MAX_URL there
instead of Config.maxRequestHeaderSize). As a result, access.log would
get "error:request-too-large" for such requests. This solution looks
better because Squid eventually denies such requests anyway and,
moreover, many contexts in Squid are already tied to that MAX_URL
limitation. A rough sketch of this check is shown after this list.
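As a rough illustration of option (2), the sketch below rejects a
request as soon as the URI portion of the first line exceeds MAX_URL,
rather than waiting for the Config.maxRequestHeaderSize check. It is
an assumption-laden standalone example, not Squid's actual parser: the
ParseResult enum and checkRequestLineUri() helper are hypothetical,
and real first-line parsing also validates the method and version.

    // Illustrative sketch only; not Http::One::RequestParser code.
    #include <cstddef>
    #include <string>
    #include <string_view>

    static constexpr std::size_t MAX_URL = 8 * 1024; // Squid's compile-time URL cap

    enum class ParseResult { NeedMore, Ok, UriTooLarge };

    // Checks only the URI-length aspect of a request line like
    // "GET <uri> HTTP/1.1".
    ParseResult checkRequestLineUri(std::string_view requestLine)
    {
        const auto methodEnd = requestLine.find(' ');
        if (methodEnd == std::string_view::npos)
            return ParseResult::NeedMore; // method not terminated yet

        const auto uriStart = methodEnd + 1;
        const auto uriEnd = requestLine.find(' ', uriStart);
        const auto uriLen = (uriEnd == std::string_view::npos)
            ? requestLine.size() - uriStart   // URI still being accumulated
            : uriEnd - uriStart;

        if (uriLen > MAX_URL)
            return ParseResult::UriTooLarge; // would be logged as error:request-too-large

        return (uriEnd == std::string_view::npos) ? ParseResult::NeedMore : ParseResult::Ok;
    }

    int main()
    {
        const std::string longUri(MAX_URL + 1, 'a');
        const std::string line = "GET " + longUri + " HTTP/1.1";
        return checkRequestLineUri(line) == ParseResult::UriTooLarge ? 0 : 1; // exits 0
    }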
So, the question is: can we go ahead with (2) without breaking anything?
Or is there another (better) alternative approach?
Thank you,
Eduard.