<html>
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
</head>
<body text="#000000" bgcolor="#FFFFFF">
<br>
<br>
<div class="moz-cite-prefix">28.10.15 16:47, Amos Jeffries wrote:<br>
</div>
<blockquote cite="mid:5630A7D6.7040505@treenet.co.nz" type="cite">
<pre wrap="">On 28/10/2015 11:35 p.m., Yuri Voinov wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Hi gents.
I think all of you who use Bump have seen plenty of these messages in
your cache.log:
SSL3_READ_BYTES:sslv3 alert certificate unknown
AFAIK, there is no way to identify which CA is missing from your setup.
I propose to consider the following question: how do we properly support
an SSL proxy if we cannot identify the problem certificates? The
telepaths are all sunbathing in Bali. At present there is no procedure
that can quickly and effectively identify such a certificate.
The situation at the moment is as follows. The SSL library is a thing in
itself: it runs on its own and writes no logs. Squid, for its part,
receives no useful information from the library beyond obscure
diagnostic messages. We have no way to make the SSL library produce
better diagnostics and, as I understand it, we never will.
So, any ideas?</pre>
</pre>
</blockquote>
<pre wrap="">
Make sure Squid is sending the whole CA chain to the remote end?</pre>
</blockquote>
I think so: "from the remote end". If a web server uses a CA that
does not exist on our proxy, we must install that CA on the proxy
manually (which means "trust it", yes?).<br>
<br>
I have an idiotic idea: Squid could fetch the remote CA and
interactively offer to trust and install it. :) This is, of course,
clinical idiocy. :)<br>
<br>
But to support a real Squid installation with thousands of users, I
really want to know which CAs are missing on my side.<br>
<br>
Intermediate CAs are not the problem: if we already have the root CA,
fetching the intermediate chain is not a big deal. <br>
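As a sketch of what "fetching the intermediate chain" can look like in practice: missing intermediates can often be located via the Authority Information Access (AIA) extension of the server certificate. The demo below generates a throwaway certificate carrying an AIA URI, since we have no real server cert at hand; the URI http://example.com/ca.crt and the /tmp paths are placeholders, and the -addext option assumes OpenSSL 1.1.1 or newer.<br>

```shell
# Generate a demo certificate with an AIA "CA Issuers" URI, standing in
# for a server certificate captured from a real site.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
  -addext "authorityInfoAccess = caIssuers;URI:http://example.com/ca.crt" \
  -keyout /tmp/demo_key.pem -out /tmp/demo_cert.pem 2>/dev/null

# Extract the CA Issuers URI; fetching that URL (curl/wget) normally
# yields the missing intermediate certificate, usually in DER form.
openssl x509 -in /tmp/demo_cert.pem -noout -text | grep "CA Issuers"
```

Not every CA publishes AIA pointers, but when present this avoids guessing which intermediate is missing.<br>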
<br>
Here, however, we are facing exactly an unknown root CA.<br>
<br>
Yes?<br>
<br>
And so what?<br>
<br>
Yes, I can poll all the users, watch a huge access.log, try to identify
the problem URL row by row, and run curl/wget against it.<br>
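For what it's worth, once a suspect host is known, the manual check can at least be scripted. A minimal sketch, assuming a Unix host with the OpenSSL CLI and an OpenSSL-style hashed trust store (the /etc/ssl/certs path varies by distribution); the self-signed certificate below merely stands in for a chain captured from the failing origin:<br>

```shell
# A real chain would be captured from the failing origin, e.g.:
#   openssl s_client -connect HOST:443 -servername HOST -showcerts </dev/null
# Here a locally generated self-signed certificate stands in for the
# top-most certificate of such a chain.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=Unknown Test CA" \
  -keyout /tmp/ca_key.pem -out /tmp/ca_cert.pem 2>/dev/null

# Show who issued the certificate; for a real chain, the issuer of the
# top-most certificate names the root CA the proxy would need to trust.
openssl x509 -in /tmp/ca_cert.pem -noout -subject -issuer

# OpenSSL hashed trust stores link each CA as <hash>.0; if no such link
# exists, that CA is unknown locally. /etc/ssl/certs is an assumed path.
hash=$(openssl x509 -in /tmp/ca_cert.pem -noout -issuer_hash)
ls "/etc/ssl/certs/$hash.0" 2>/dev/null || echo "CA $hash not in local trust store"
```

This still requires knowing the failing host first, which is exactly the information cache.log withholds.<br>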
<br>
And?<br>
<br>
Do this procedure every day?<br>
<br>
This is not the best use of one's time.<br>
<br>
Of course, the OpenSSL developers deserve to have their hands torn
off for this. But what about us, smart and handsome? ;)<br>
<br>
<blockquote cite="mid:5630A7D6.7040505@treenet.co.nz" type="cite">
<pre wrap="">
Amos
_______________________________________________
squid-users mailing list
<a class="moz-txt-link-abbreviated" href="mailto:squid-users@lists.squid-cache.org">squid-users@lists.squid-cache.org</a>
<a class="moz-txt-link-freetext" href="http://lists.squid-cache.org/listinfo/squid-users">http://lists.squid-cache.org/listinfo/squid-users</a>
</pre>
</blockquote>
<br>
</body>
</html>