[squid-users] Caching http google deb files

Antony Stone Antony.Stone at squid.open.source.it
Wed Oct 5 19:13:21 UTC 2016


On Wednesday 05 October 2016 at 20:40:46, Hardik Dangar wrote:

> Hey Jok,
> 
> Thanks for the suggestion, but the big issue with that is I have to download
> the whole repository (about 80-120 GB) first, and then each week I need to
> download another 20 to 25 GB.

This is not true for apt-cacher-ng.  You install it and it does nothing.  You 
point your Debian (or Ubuntu, maybe other Debian-derived distros as well, I 
haven't tested) machines at it as their APT proxy, and it then caches content 
as it gets requested and downloaded.  Each machine which requests a new 
package causes that package to get cached.  Each machine which requests a 
cached package gets the local copy (unless it's been updated, in which case 
the cache gets updated).
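
As an illustration (a minimal sketch: the hostname "aptcache" is made up, and 
3142 is apt-cacher-ng's default listen port), pointing a client at the cache 
is a one-line APT option, e.g. in /etc/apt/apt.conf.d/02proxy:

    # /etc/apt/apt.conf.d/02proxy on each client machine
    # "aptcache" is a placeholder for the host running apt-cacher-ng
    Acquire::http::Proxy "http://aptcache:3142";

After that, an ordinary "apt-get update && apt-get install <package>" on any 
client goes through the cache; there is no mirror to pre-download.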

> We hardly use any of that except a few popular repos.
> The big issue I always have with most of them is third-party repos.
> squid-deb-proxy is quite reliable, but again it's just squid with a custom
> config, nothing else, and it fails to cache Google debs.
> 
> Squid is perfect for me because it can cache things the first time they are
> requested, so the next time anybody requests them, they're ready.

This is exactly how apt-cacher-ng works.  I use it myself and I would 
recommend you investigate it further for this purpose.

> The problem arises when big companies like Google and GitHub don't want us
> to cache their content and use various tricks to prevent us from doing so.

That's a strange concept for a Debian repository (even a third-party one).

Are you sure you're talking about repositories and not just isolated .deb 
files?
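
To make the distinction concrete (the Google Chrome lines below are only an 
illustration; substitute whatever you actually use): a repository is something 
APT itself fetches indices and packages from, e.g.

    # /etc/apt/sources.list.d/google-chrome.list (repository: APT traffic,
    # so a caching proxy such as apt-cacher-ng can see and cache it)
    deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main

whereas an isolated .deb is a single file fetched outside APT, e.g.

    # downloaded by hand, outside APT, so an APT proxy never sees it
    wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb

and the two behave quite differently as far as caching is concerned.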


Antony.

-- 
A user interface is like a joke.
If you have to explain it, it didn't work.

                                                   Please reply to the list;
                                                         please *don't* CC me.

