[squid-users] ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué edouard at e-gaulue.com
Thu Nov 12 09:03:40 UTC 2015


Hi Marcus, Amos and maybe others,

Here is where I am: I've looked in the logs. Let me describe what I observe. 
It may be linked to some other posts I've read.

Imagine I try to connect to http://ad.doubleclick.net/ad.jpg. I observe 
the request in Wireshark. It goes to the Squid process: there is no SSL 
involved, so no bump. SquidGuard sends its redirect to Squid, and Squid 
sends this to the browser:
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Wed, 11 Nov 2015 22:49:44 GMT
Content-Length: 0
Location: 
https://proxyweb.xxx.xxx/cgi-bin/squidGuard-simple.cgi?clientaddr=xxx&clientgroup=low-ip&targetgroup=adv&url=http://ad.doubleclick.net/ad.jpg
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive

The browser next sends a request to proxyweb: everything is fine.
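For reference, the exchange with the rewriter in this non-SSL case follows 
Squid's url_rewrite helper protocol: one request line in on stdin, one 
"OK status=302 url=..." line out (the same format squidGuard emits later in 
this thread). A tiny stand-in helper, with a placeholder hostname that is 
not the real proxyweb CGI, can mimic it:

```shell
# Stand-in rewriter sketch (NOT squidGuard itself): reads
# "URL client-ip/fqdn user method" lines on stdin and answers with the
# Squid 3.4+ helper protocol. proxyweb.example is a placeholder.
printf 'http://ad.doubleclick.net/ad.jpg 192.0.2.10/- - GET\n' |
while read -r url rest; do
  echo "OK status=302 url=https://proxyweb.example/cgi-bin/blocked?url=$url"
done
# -> OK status=302 url=https://proxyweb.example/cgi-bin/blocked?url=http://ad.doubleclick.net/ad.jpg
```

Squid turns that helper answer into the HTTP/1.1 302 response shown above.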


Now let's add the SSL part and connect to 
https://ad.doubleclick.net/ad.jpg. I observe the request in Wireshark. 
It goes to the Squid process over SSL. SquidGuard sends its redirect to 
Squid, and Squid sends this to the browser:
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Wed, 11 Nov 2015 22:49:44 GMT
Content-Length: 0
Location: 
https://proxyweb.xxx.xxx/cgi-bin/squidGuard-simple.cgi?clientaddr=xxx&clientgroup=low-ip&targetgroup=adv&url=http://ad.doubleclick.net/ad.jpg
X-Cache: MISS from squid

The browser then tries to send a direct request to 
https://ad.doubleclick.net, which never completes because this machine 
is not allowed to reach the Internet directly.

Why is the redirect not the same with and without SSL? It looks like it 
confuses the browser (at least Mozilla and IE).

I can also provide Squid logs, but tell me which ones you need, because I've got a lot...
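For context, here is a minimal squid.conf sketch of the kind of setup under 
discussion (the directives are real Squid 3.5 ones, but the port, paths and 
certificate name are placeholders, not taken from this thread):

```
# Placeholder port/paths; the real setup will differ.
http_port 3128 ssl-bump \
    cert=/etc/squid/ssl/proxyCA.pem \
    generate-host-certificates=on

acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

# squidGuard is consulted for every request, bumped or not.
url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
url_rewrite_children 5
```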

Regards, EG


On 05/11/2015 14:01, Marcus Kool wrote:
>
>
> On 11/04/2015 08:55 PM, Edouard Gaulué wrote:
>> Hi Marcus,
>>
>> Well, it's just a URL rewriter program. You can test it from the 
>> command line:
>> echo "URL" | /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
>>
>> Before I understood it was possible to specify the redirect code, I 
>> got this:
>> #> echo
>> "https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386? 
>>
>> - - GET"|/usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
>> #> OK
>> rewrite-url="https://proxyweb.XXXXX.XXXXX/cgi-bin/squidGuard-simple.cgi?clientaddr=-pipo&clientname=&clientuser=&clientgroup=default&targetgroup=unknown&url=https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?" 
>>
>>
>>
>> After a little change in the squidguard.conf, I get:
>> #> OK status=302
>> url="https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=-pipo&clientname=&clientuser=&clientgroup=default&targetgroup=unknown&url=https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?" 
>>
>
> This looks fine, so now you need to look at Squid and set the debug 
> options to find out what it is doing.
>
> Note that squidGuard does not percent-escape the URL parameter as it 
> should (see RFC 3986).
> This is, however, most likely not the cause of the issue that you are 
> seeing.
>
> Marcus
>
>>
>> It's not handled much better by my browser, which shows a "can't 
>> connect to https://ad.doubleclick.net" message. But I don't get the 
>> Squid message about http/https anymore.
>>
>> It may be that url_rewrite_program runs after the peek-and-splice 
>> steps, leading Squid into an unpredictable situation. Is there a way 
>> to influence the order in which things happen in Squid?
>>
>> Regards, EG
>>
>>
>> On 04/11/2015 14:10, Marcus Kool wrote:
>>> You need to know what squidGuard actually sends to Squid.
>>> squidGuard does not have a debug option for this, so you have to set
>>>    debug_options ALL,1 61,9
>>> in squid.conf to see what Squid receives.
>>> I bet that what Squid receives, is what it complains about:
>>> the URL starts with 'https://http'
>>>
>>> Marcus
>>>
>>> On 11/04/2015 10:55 AM, Edouard Gaulué wrote:
>>>> On 04/11/2015 11:00, Amos Jeffries wrote:
>>>>> On 4/11/2015 12:48 p.m., Marcus Kool wrote:
>>>>>> I suspect that the problem is that you redirect a HTTPS-based URL 
>>>>>> to an
>>>>>> HTTP URL and Squid does not like that.
>>>>>>
>>>>>> Marcus
>>>> To try things in that direction, I now redirect to an HTTPS 
>>>> server. And I get:
>>>>
>>>> The following error was encountered while trying to retrieve the 
>>>> URL: https://https/*
>>>>
>>>>     *Unable to determine IP address from host name "https"*
>>>>
>>>> The DNS server returned:
>>>>
>>>>     Name Error: The domain name does not exist.
>>>>
>>>>
>>>> Moreover, this would sometimes redirect an HTTP-based URL to an 
>>>> HTTPS URL, and I don't know how well Squid likes that either.
>>>>
>>>>> No it is apparently the fact that the domain name being redirected 
>>>>> to is
>>>>> "http".
>>>>>
>>>>> As in:"http://http/something"
>>>>>
>>>> I can assure you my rewrite URL looks like 
>>>> "https://proxyweb.xxxxx.xxxxx/var1=xxxx&...".
>>>>
>>>> And this confirms that ssl_bump parses this result and takes the 
>>>> part to the left of the ":". To experiment, I have also redirected 
>>>> to "proxyweb.xxxxx.xxxxx:443/var1=xxxx&..." (i.e. I removed the 
>>>> "https://" and added a ":443") to force the parsing. Then I don't 
>>>> get this message anymore, but Mozilla gets confused: it waits for 
>>>> the ad.doubleclick.net certificate and receives the 
>>>> proxyweb.xxxxx.xxxxx one instead. And of course this breaks my SG 
>>>> configuration and can't be a production solution.
>>>>> Which brings up the question of why you are using SG to block 
>>>>> adverts?
>>>>>
>>>>> squid.conf:
>>>>>   acl ads dstdomain .doubleclick.net
>>>>>   http_access deny ads
>>>>>
>>>>> Amos
>>>>>
>>>>>
>>>> I don't use SG specifically to block adverts; I use it to block 90% 
>>>> of the web. Ads are just an example here, but it could be so many 
>>>> other things...
>>>>
>>>> I just want to try to make SG and ssl_bump live together.
>>>>
>>>> Is it possible to have a rule like "if it has been rewritten, then 
>>>> don't ssl_bump"?
>>>>
>>>> Regards, EG
>>>>
>>>>
>>>> _______________________________________________
>>>> squid-users mailing list
>>>> squid-users at lists.squid-cache.org
>>>> http://lists.squid-cache.org/listinfo/squid-users
>>>>
>>
>>
>>
