[squid-users] Collecting squid logs to DB
Eliezer Croitoru
eliezer at ngtech.co.il
Sun May 13 01:21:18 UTC 2018
I have daemons written in Ruby and Go which can do a better job.
Specifically, for your scenario I think the better option is to use a TCP server, such as the one described at:
https://wiki.squid-cache.org/Features/LogModules#Module:_TCP_Receiver
Ubuntu and Debian use systemd, so you can run the log daemon as a service that is not tied directly to squid.
MyISAM vs. InnoDB is not really a question: from my experience, InnoDB is the only relevant choice, for a couple of reasons.
I will try to update you here with the relevant script that you might be able to use for real-time logging.
You should be able to use the following line for your purpose:
access_log tcp://127.0.0.1:5000 squid
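To give an idea of what the receiving side of that access_log line might look like, here is a minimal sketch of a TCP log receiver in Python. It assumes only that squid's TCP logging module sends plain newline-terminated access-log lines over the connection; the sample log line, the in-memory list standing in for a DB insert, and the self-test client at the bottom are all illustrative, not part of squid:

```python
import socket
import socketserver
import threading
import time

received = []  # collected log lines (stand-in for batching INSERTs into a DB)

class LogHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Each newline-terminated line is one access-log entry; a real
        # daemon would buffer these and flush them to the DB in batches.
        for raw in self.rfile:
            line = raw.decode("utf-8", errors="replace").rstrip("\n")
            if line:
                received.append(line)

# Port 0 lets the OS pick a free port for this demo; to match the
# "access_log tcp://127.0.0.1:5000 squid" directive you would bind port 5000.
server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), LogHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Self-test: play the role of squid and send one invented log line.
with socket.create_connection(server.server_address) as s:
    s.sendall(b"1526169600.123 45 10.0.0.5 TCP_MISS/200 1024 GET "
              b"http://example.com/ - HIER_DIRECT/93.184.216.34 text/html\n")

# Give the handler thread a moment to process the line.
for _ in range(100):
    if received:
        break
    time.sleep(0.01)

server.shutdown()
server.server_close()
print(received[0].split()[2])  # third field of the squid format: client IP
```

The daemon stays decoupled from squid this way: if it restarts, squid reconnects, and the cache itself never blocks on DB latency.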
Eliezer
----
Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: eliezer at ngtech.co.il
From: Alex K <rightkicktech at gmail.com>
Sent: Sunday, May 13, 2018 01:56
To: Eliezer Croitoru <eliezer at ngtech.co.il>
Cc: squid-users at lists.squid-cache.org
Subject: Re: [squid-users] Collecting squid logs to DB
+++ Including list +++
Hi Eliezer,
I have used the following lines to instruct squid to log to MariaDB:
logfile_daemon /usr/lib/squid/log_db_daemon
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid squid
In testing, it seems that sometimes squid does not log anything; I don't know why. After a restart it seems to unblock and write to the DB again.
The access_log table is currently InnoDB and I am wondering if MyISAM will behave better.
I would prefer to have a real-time access log. My scenario is that when a user disconnects from squid, an aggregated report of the sites the user browsed will be available under a web portal to which the user has access. Usually there will be up to 20 users connected concurrently, so I have to check whether this approach scales. If this approach is not stable, then I might go with log parsing (perhaps logstash or some custom parser) which will parse the log and generate an aggregated report once per hour or day.
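The custom-parser fallback described above could be sketched roughly like this. Field positions assume squid's default native "squid" logformat (timestamp, elapsed, client, code/status, bytes, method, URL, user, hierarchy, type); the sample log entries are invented for illustration:

```python
from collections import defaultdict
from urllib.parse import urlparse

def aggregate(lines):
    """Return {user: {site: hit_count}} from native squid-format log lines."""
    report = defaultdict(lambda: defaultdict(int))
    for line in lines:
        fields = line.split()
        if len(fields) < 8:
            continue  # skip malformed lines
        url, user = fields[6], fields[7]
        # CONNECT requests log "host:port" rather than a full URL,
        # so fall back to splitting on the colon when there is no scheme.
        site = urlparse(url).hostname or url.split(":")[0]
        report[user][site] += 1
    return report

sample = [
    "1526169600.123 45 10.0.0.5 TCP_MISS/200 1024 GET http://example.com/a"
    " - HIER_DIRECT/93.184.216.34 text/html",
    "1526169601.456 30 10.0.0.5 TCP_HIT/200 512 GET http://example.com/b"
    " alex HIER_NONE/- text/css",
    "1526169602.789 80 10.0.0.6 TCP_TUNNEL/200 9000 CONNECT example.org:443"
    " alex HIER_DIRECT/93.184.216.35 -",
]
report = aggregate(sample)
print(dict(report["alex"]))  # per-site hit counts for user "alex"
```

Run hourly (e.g. from a systemd timer or cron) against a rotated access.log, something like this would feed the per-user report without touching the DB on every request.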
Is there a way to format the log and pipe only some interesting fields to the DB, in order to lessen the load on the DB?
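For what it's worth, squid.conf does allow defining a trimmed-down logformat and attaching it to an access_log line, along these lines (the format name "dblog" and the choice of fields are illustrative):

```
# Hypothetical minimal format: timestamp, client IP, user, URL, HTTP status
logformat dblog %ts.%03tu %>a %un %ru %>Hs
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid dblog
```

One caveat: the stock log_db_daemon helper expects the default "squid" format, so a custom format would likely require adapting the daemon (or using a custom TCP receiver) to match the reduced column set.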
On Sun, May 13, 2018 at 1:25 AM, Eliezer Croitoru <eliezer at ngtech.co.il> wrote:
Hey Alex,
How were you logging to the DB? What configuration lines have you used?
Also, what log format have you used?
Is it important to have real-time data in the DB, or is periodic parsing also an option?
Eliezer
----
From: squid-users <squid-users-bounces at lists.squid-cache.org> On Behalf Of Alex K
Sent: Saturday, May 5, 2018 01:20
To: squid-users at lists.squid-cache.org
Subject: [squid-users] Collecting squid logs to DB
Hi all,
I had a previous setup on Debian 7 with squid, where I was using mysar to collect squid logs, store them to a DB, and provide a browsing report at the end of the day.
Now on Debian 9, trying to upgrade the whole setup, I see that mysar does not compile.
Checking around I found mysar-ng, but it also has compilation issues on Debian 9.
Do you suggest any tool that does this job? Does squid support logging to a DB natively? (I am using MySQL/MariaDB.)
Some other tool I stumbled on is https://github.com/paranormal/blooper.
Thanx a bunch,
Alex