[squid-users] Peer2Peer Url categorizing, black\white lists, can it work?

Eliezer Croitoru eliezer at ngtech.co.il
Tue Jul 26 13:43:42 UTC 2016


Thanks Amos,

The concept is simple and easy to implement, but the project is no longer maintained.
The URL http://gremlin.ru/soft/drbl/en/zones.html is broken :\

I have also seen: RiskIQ -> https://en.wikipedia.org/wiki/RiskIQ
And a dnsmasq blacklist: https://github.com/britannic/blacklist
And a reverse proxy idea: https://github.com/marinhero/goxy

In any case, it's not like DHT and similar ideas.
DRBL has a very solid concept, but it lacks a couple of the features I was thinking about.

Currently I have a client for public RBLs such as Symantec and OpenDNS.
And here is some nice example code that handles DNS blacklist queries in Go: https://github.com/jersten/ipchk
(noting it here so I remember it later)
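
For reference, a minimal sketch of the kind of lookup such a client performs, assuming a Spamhaus-style IPv4 DNSBL; the zone name and the test address below are illustrative placeholders, not an endorsement of any particular list:

// dnsblcheck: minimal sketch of an IPv4 DNSBL lookup.
// The address is reversed octet-by-octet, prepended to the
// blacklist zone, and resolved; any A record means "listed",
// while NXDOMAIN means "not listed".
package main

import (
    "fmt"
    "net"
    "strings"
)

// reverseIPv4 turns "192.0.2.1" into "1.2.0.192".
func reverseIPv4(ip string) string {
    octets := strings.Split(ip, ".")
    for i, j := 0, len(octets)-1; i < j; i, j = i+1, j-1 {
        octets[i], octets[j] = octets[j], octets[i]
    }
    return strings.Join(octets, ".")
}

// listed reports whether ip appears in the given DNSBL zone.
func listed(ip, zone string) bool {
    addrs, err := net.LookupHost(reverseIPv4(ip) + "." + zone)
    // A resolver error (typically NXDOMAIN) means "not listed".
    return err == nil && len(addrs) > 0
}

func main() {
    // 127.0.0.2 is the conventional "always listed" test address.
    fmt.Println(listed("127.0.0.2", "zen.spamhaus.org"))
}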

I will try to calculate a couple of things and then I will move on.

Eliezer

----
Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: eliezer at ngtech.co.il


-----Original Message-----
From: squid-users [mailto:squid-users-bounces at lists.squid-cache.org] On Behalf Of Amos Jeffries
Sent: Tuesday, July 26, 2016 2:45 PM
To: squid-users at lists.squid-cache.org
Subject: Re: [squid-users] Peer2Peer Url categorizing, black\white lists, can it work?

On 26/07/2016 3:14 p.m., Eliezer Croitoru wrote:
> This has been on my plate for quite some time, and I was wondering about
> the options and the level of interest in the subject.
> 
> Intro:
> Currently most free blacklists are distributed the old-fashioned way, as
> a tar or other archive file.
> That has its benefits, but I have not seen an option that lets admins
> "help" each other.
> For example, many proxy servers "know" about domains that others do not.
> So even if a site exists and is known on one side of the planet, it is
> unknown on the other.
> If a site can be categorized or white\black listed on one side of the
> planet, why can't we help each other?
> Many admins add sites to their DBs and lists, but not many share them
> publicly.
> 
> The idea:
> As an example, Google and Mozilla services advertise malware-infected
> sites through their browsers.
> Many filtering solutions use their clients' logs to inspect and enhance
> their lists.
> There are many distributed key+value DB systems, such as etcd, and many
> others that are DHT-based.
> I believe URL categorization and black\white lists could be advertised
> in a similar way (a minimal sketch follows this quoted message).
> The only limit is the "bootstrap" nodes or the "routers" of such a
> network.
> Since such a service would only handle keys and values, which today
> should not exceed 1MB, I believe it would be pretty simple to create
> networks based on that.
> Once a network category or scheme is defined, it would be pretty simple
> to "match" or "connect" the relevant nodes.
> 
> Currently I am looking at the different options for the backend DB,
> permissions, and hierarchy, which should give an admin a nice starting
> point.
> Such an "online network" could be up and running pretty quickly, and it
> could keep the regular categories and lists more up-to-date.
> Beyond the actual categorizing and listing, I believe it would be
> possible to share and generate a list of known public domains, in
> contrast to the current state in which many parts of the web are
> "unknown".


I suggest you look into how DRBL works,
<http://gremlin.ru/soft/drbl/en/faq.html>. The distributed blacklist
design was created by the anti-spam community both to protect list
maintainers against legal threats and to provide resistance against
individual nodes disappearing for any other reason.
That system would allow immediate linkup with some existing public
blacklists like SURBL, which lists websites used by spammers for malware
or phishing hosting.

All that's needed in Squid would be an ACL to do the lookups; a sketch of
one possible helper follows.
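
A minimal sketch of such a helper, assuming Squid's external_acl_type
mechanism and a SURBL-style domain lookup; the helper path, the zone, and
the squid.conf lines in the comments are illustrative assumptions, not a
tested configuration:

// dnsbl_helper: sketch of a Squid external ACL helper that checks
// each destination domain against a SURBL-style DNS blacklist.
//
// Illustrative squid.conf wiring:
//   external_acl_type dnsbl %DST /usr/local/bin/dnsbl_helper
//   acl blacklisted external dnsbl
//   http_access deny blacklisted
package main

import (
    "bufio"
    "fmt"
    "net"
    "os"
    "strings"
)

func main() {
    // Squid writes one %DST domain per line on stdin and expects
    // one OK/ERR verdict per line on stdout in return.
    scanner := bufio.NewScanner(os.Stdin)
    for scanner.Scan() {
        domain := strings.TrimSpace(scanner.Text())
        // Any A record under the zone means "listed";
        // a resolver error (NXDOMAIN) means "clean".
        if _, err := net.LookupHost(domain + ".multi.surbl.org"); err == nil {
            fmt.Println("OK") // listed: the ACL matches
        } else {
            fmt.Println("ERR") // not listed
        }
    }
}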

Amos

_______________________________________________
squid-users mailing list
squid-users at lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


