<div dir="ltr">Hey Jok,<div><br></div><div>Thanks for the suggetion but the big issue with that is i have to download whole repository about ( 80-120 GB ) first and then each week i need to download 20 to 25 GB. We hardly use any of that except few popular repos. big issue i always have with most of them is third party repo's. squid-deb-proxy is quite reliable but again its squid with custom config nothing else and it fails to cache google debs.</div><div><br></div><div>Squid is perfect for me because it can cache things which is requested first time. So next time anybody requests it it's ready. The problem lies when big companies like google and github does not wants us to cache their content and puts various tricks so we can't do that. My issue is same google deb files are downloaded 50 times in same day as apt updates happen and i waste 100s of gb into same content. Country where i live bandwidth is very very costly matter and fast connections are very costly. So this is important for me.</div><div><br></div><div>@Amos,</div><div><br></div><div>I think it's about time Squid needs update of code which can cache use cases like difficult to handle google and github. I am interested to create proposal and will soon share at squid dev and ask for ideas and will try to get official approval so i can build this according to squid standards.</div><div><br></div><div>but before that can you help me with few things.essentially i don't have much experience with C code. as i have worked most of my life with php,python and javascript side. I do know how to write C code but i am not an expert at it. So i want to know if there is any pattern squid follows except the oop pattern. I also want to know workflow of squid i.e. what happens when it receives request and how acls are applied programmatically and how refresh patterns are applied. is there a way i can debug and check if refresh patterns are applied for given url. as well as <span style="font-size:12.8px">reply_header_replace</span> has replaced header if i can see those lines in debug it will help me with this. i know debug options can help me but if i turn it with level 9 it is very difficult to go past so many debug entries.</div><div><br></div><div>My idea is to develop a module which will not change any of the squid code but will be loaded only if its called explicitly within squid config. So i want to know is there any piece of code available within squid which behaves similarly just like your archive mode.</div><div><br></div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Wed, Oct 5, 2016 at 9:49 PM, Jok Thuau <span dir="ltr"><<a href="mailto:jok@spikes.com" target="_blank">jok@spikes.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">This is sort of off-topic, but have you considered using a deb repo mirroring software?<div>(it would mean that you need to update your clients to point to that rather than google, but that's not really difficult). </div><div>software like aptly (<a href="http://aptly.info" target="_blank">aptly.info</a>) are really good about this (though a little hard to get going in the first place). or a deb-caching proxy (apt-cacher-ng? 
squid-deb-proxy?)<br><div><br></div></div></div><div class="gmail_extra"><br><div class="gmail_quote"><div><div class="h5">On Tue, Oct 4, 2016 at 7:30 AM, Hardik Dangar <span dir="ltr"><<a href="mailto:hardikdangar+squid@gmail.com" target="_blank">hardikdangar+squid@gmail.com</a>></span> wrote:<br></div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div class="h5"><div dir="ltr">Wow, I hadn't thought about that. Google might need the tracking data; that could be the reason they have blindly put a Vary: * header on it. Oh, the irony: the company that tells all of us how to deliver content is doing such a thing.<div><br></div><div>I have looked at your patch, but how do I enable it? Do I need to write a custom ACL? I know I need to compile and reinstall after applying the patch, but what exactly do I need to do in the squid.conf file? Looking at your patch, I am guessing I need to write an archive ACL, or maybe I am too naive to understand C code :)</div><div><br></div><div>Also, </div><div><br></div><div><div style="font-size:12.8px">is reply_header_replace any good for this?</div></div><div style="font-size:12.8px"><br></div></div><div class="m_1484267884810220441HOEnZb"><div class="m_1484267884810220441h5"><div class="gmail_extra"><br><div class="gmail_quote">On Tue, Oct 4, 2016 at 7:47 PM, Amos Jeffries <span dir="ltr"><<a href="mailto:squid3@treenet.co.nz" target="_blank">squid3@treenet.co.nz</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><span>On 5/10/2016 2:34 a.m., Hardik Dangar wrote:<br>
> Hey Amos,<br>
><br>
> We have about 50 clients which download the same Google Chrome update every 2<br>
> or 3 days, which means about 2.4 GB. Although the response says Vary, the<br>
> requested file is the same and all of it is downloaded via apt update.<br>
><br>
> Is there any option just like ignore-no-store? I know I am asking for too<br>
> much, but it seems very silly on Google's part that they are sending a Vary<br>
> header in a place where they shouldn't, as no matter how you access those<br>
> URLs you are only going to get those deb files.<br>
<br>
<br>
</span>Some things G does only make sense when you ignore all the PR about<br>
wanting to make the web more efficient and consider that it's a company whose<br>
income is derived from recording data about people's habits and activities.<br>
Caching can hide that info from them.<br>
<span><br>
><br>
> Can I hack the Squid source code to ignore the Vary header?<br>
><br>
<br>
</span>Google are explicitly saying the response changes. I suspect there is<br>
something involving Google account data being embedded in some of the<br>
downloads. For tracking, etc.<br>
<br>
<br>
If you are wanting to test it I have added a patch to<br>
<<a href="http://bugs.squid-cache.org/show_bug.cgi?id=4604" rel="noreferrer" target="_blank">http://bugs.squid-cache.org/s<wbr>how_bug.cgi?id=4604</a>> that should implement<br>
archival of responses where the ACLs match. It is completely untested by<br>
me beyond building, so YMMV.<br>
<span class="m_1484267884810220441m_-4634714588022039415HOEnZb"><font color="#888888"><br>
Amos<br>
<br>
</font></span></blockquote></div><br></div>
</div></div><br></div></div><span class="">______________________________<wbr>_________________<br>
squid-users mailing list<br>
<a href="mailto:squid-users@lists.squid-cache.org" target="_blank">squid-users@lists.squid-cache.<wbr>org</a><br>
<a href="http://lists.squid-cache.org/listinfo/squid-users" rel="noreferrer" target="_blank">http://lists.squid-cache.org/l<wbr>istinfo/squid-users</a><br>
<br></span></blockquote></div><br></div>
</blockquote></div><br></div>
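<div dir="ltr"><br></div><div dir="ltr">P.S. A few squid.conf sketches to make the questions above concrete. These are untested sketches based on my reading of the documentation, so please treat the exact names and numbers as assumptions to verify rather than known-good config.</div><div dir="ltr"><br></div><div dir="ltr">To watch refresh_pattern and ACL decisions without drowning in level-9 output, debug_options can raise the level for individual debug sections only. The section numbers below (22 for the refresh/staleness calculation, 28 for ACL matching) are from doc/debug-sections.txt in the Squid source tree; please double-check them against your version.<pre>
# Keep everything else quiet at level 1, but log refresh calculations
# (section 22) and ACL checks (section 28) in detail.
debug_options ALL,1 22,5 28,5
</pre></div>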
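<div dir="ltr">For the apt traffic itself, a refresh_pattern along the lines commonly suggested for apt-over-Squid setups is below. It does not work around the Vary problem Amos describes; it only keeps ordinarily cacheable .deb files from being refetched.<pre>
# Treat .deb packages as fresh for up to 90 days (129600 minutes).
# refresh-ims makes Squid revalidate with the origin when a client
# sends If-Modified-Since, so clients still see current metadata.
refresh_pattern -i \.deb$ 129600 100% 129600 refresh-ims
</pre></div>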
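<div dir="ltr">On reply_header_replace: the syntax is sketched below (assuming the Chrome repo host is dl.google.com), but as far as I understand, header mangling only changes the copy of the header that is forwarded on; it does not change how Squid itself evaluates Vary when deciding what to cache, so it probably does not solve this problem.<pre>
# Hide/replace the Vary header on responses from Google's download host.
# This affects the header as forwarded, not Squid's own caching logic.
acl google_dl dstdomain dl.google.com
reply_header_access Vary deny google_dl
reply_header_replace Vary Accept-Encoding
</pre></div>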
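<div dir="ltr">On the "module loaded only from squid.conf" idea: the closest existing mechanism I know of is an eCAP adapter, which is built outside the Squid tree and only used when named in the config. Everything below is hypothetical (the adapter path and service URI are made up), and the exact ecap_service argument order differs between Squid versions, so check squid.conf.documented for yours.<pre>
# Hypothetical eCAP adapter, loaded only because it is named here.
loadable_modules /usr/local/lib/ecap_adapter_archive.so
ecap_enable on
ecap_service archiveService respmod_precache uri=ecap://example.com/archive
adaptation_access archiveService allow all
</pre></div>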