[squid-users] external acl helper for URI lookups from database

Lukas Yčas lukasycas at gmail.com
Thu Apr 4 01:16:54 UTC 2019


Hello,

I'm a big fan of Squid and am trying to get to know it better.

My current use case: I would like Squid to block requests the way it
currently does via acl url_regex badurl, except with the patterns coming
not from a string in the configuration file or a file on disk, but from a
MySQL database. (Imagine a simple table, sites, with rows badurl1, badurl2,
badurl3. If a client's request matches badurl1, they get the block page.)

1.1) Would this be possible?
1.2) Would this be efficient? (Imagining a lot of varied traffic, with a
DB query for every request.)
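Regarding 1.2, my thinking is that the helper would not have to hit the DB
per request: it could hold an in-memory copy of the sites table and refresh
it periodically. A sketch of that idea (the load_badurls() body is a
stand-in for a real SELECT against MySQL, and the table/column names and
refresh interval are made up):

```python
#!/usr/bin/python3
# Sketch: answer ACL lookups from a cached copy of the "sites" table,
# refreshed at most once per REFRESH_SECONDS, instead of querying MySQL
# on every request.
import sys
import time

REFRESH_SECONDS = 60  # illustrative; tune to how often the table changes


def load_badurls():
    """Placeholder for a 'SELECT url FROM sites' query against MySQL.

    Returns a fixed set here so the sketch is self-contained.
    """
    return {"badurl1", "badurl2", "badurl3"}


class BadUrlCache:
    def __init__(self):
        self.urls = load_badurls()
        self.loaded_at = time.monotonic()

    def is_blocked(self, uri):
        # Refresh the cached table when it has gone stale.
        if time.monotonic() - self.loaded_at > REFRESH_SECONDS:
            self.urls = load_badurls()
            self.loaded_at = time.monotonic()
        # Substring match, mirroring the url_regex-style check.
        return any(bad in uri for bad in self.urls)


def main():
    cache = BadUrlCache()
    for line in sys.stdin:  # squid sends one %URI per line
        uri = line.strip()
        # External ACL protocol: OK = match, ERR = no match.
        sys.stdout.write("OK\n" if cache.is_blocked(uri) else "ERR\n")
        sys.stdout.flush()  # squid waits for the reply; must not buffer


if __name__ == "__main__":
    main()
```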

And another question -
I'm attempting to write a prototype for a helper in Python.

ext_py_acl:
#!/usr/bin/python

import sys

while True:
    line = sys.stdin.readline()
    if line.find("badstring") == -1:
        sys.stdout.write('ERR\n')
    else:
        sys.stdout.write('OK\n')

squid.conf:
external_acl_type blockscript %URI /usr/local/squid/libexec/ext_py_acl
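(For completeness: the external_acl_type line by itself doesn't block
anything; it is consumed by an acl and an http_access rule, roughly along
these lines, with the acl name being whatever one picks:)

```
acl badsites external blockscript
http_access deny badsites
```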

2.1) When I run this outside of Squid, I pass random strings and get OK on
stdout, and if I pass something containing 'badstring' I receive ERR, so
according to all the docs I've read it should work with Squid.
What actually happens is that the helper processes begin spawning, 1/5,
then another, until they fill up to 5/5 (I tried setting the max to 50 and
they filled up to 50), and then they seem to hang.
When I pass -k shutdown to squid I see some garbled log output, e.g.:
[Errno 32] Broken pipe[Errno 32] Broken pipesys.stdout.write( 'ERR\n' )
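While writing this up, I started suspecting two things my prototype does
not do: flushing stdout after each reply (if the reply sits in the buffer,
Squid never sees an answer, which would look exactly like hanging helpers)
and exiting on EOF (readline() returns '' once Squid closes the pipe, so
the loop keeps writing and gets EPIPE, which would match the Broken pipe
messages at shutdown). A reworked sketch along those lines, not yet tested
against Squid:

```python
#!/usr/bin/python
import sys


def reply(line):
    # Same check as in the prototype: "badstring" present -> OK (match).
    return "ERR\n" if line.find("badstring") == -1 else "OK\n"


def run(infile, outfile):
    while True:
        line = infile.readline()
        if not line:         # EOF: squid closed the pipe (e.g. -k shutdown)
            break            # exit instead of writing into a closed pipe
        outfile.write(reply(line))
        outfile.flush()      # unbuffered replies; without this squid never
                             # sees an answer and the helper appears to hang


if __name__ == "__main__":
    run(sys.stdin, sys.stdout)
```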

Could someone advise on how to troubleshoot this further and get the
helpers running? Or is there something I'm missing here?

Regards,
Lukas

