[squid-users] Collecting squid logs to DB

Alex K rightkicktech at gmail.com
Sun May 13 11:22:31 UTC 2018


Thanx Eliezer and Amos for the feedback. I just saw the logformat directive
and will experiment with that.
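For example, a trimmed custom format could look something like this (the
format name "dblog" and the choice of fields are only an illustration;
the % codes are from Squid's logformat documentation). As far as I know
the bundled log_db_daemon expects the default squid format, so a trimmed
format may require adjusting the daemon to match:

```
# Hypothetical trimmed format: timestamp, client IP, user, status,
# reply size, method, URL -- adjust to what the reports actually need.
logformat dblog %ts.%03tu %>a %un %>Hs %<st %rm %ru
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid dblog
```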
Yes, I have a small group of users (up to 30 - 40 devices) but the hardware
is a relatively small appliance (4G RAM, 4 cores 2GHz, SSD).

Alex


On Sun, May 13, 2018, 11:37 Eliezer Croitoru <eliezer at ngtech.co.il> wrote:

> To lower the stress on the DB you can use a custom format as Amos suggested
> but..
>
> I think that once you define and write down exactly what you want to log,
> you will get what you need and want.
>
>
>
> The default Squid access log is fairly verbose, and I believe that with
> today's hardware the difference will only be seen on systems with thousands
> or millions of client requests.
>
> For a small deployment it’s not required.
>
>
>
> All The Bests,
>
> Eliezer
>
>
>
> ----
>
> Eliezer Croitoru
> Linux System Administrator
> Mobile: +972-5-28704261
> Email: eliezer at ngtech.co.il
>
>
>
> *From:* Alex K <rightkicktech at gmail.com>
> *Sent:* Sunday, May 13, 2018 01:56
> *To:* Eliezer Croitoru <eliezer at ngtech.co.il>
> *Cc:* squid-users at lists.squid-cache.org
> *Subject:* Re: [squid-users] Collecting squid logs to DB
>
>
>
> +++ Including list +++
>
> Hi Eliezer,
>
> I have used the following lines to instruct Squid to log to MariaDB:
>
> logfile_daemon /usr/lib/squid/log_db_daemon
> access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid squid
>
> In testing it seems that sometimes Squid does not log anything; I
> don't know why. After a restart it unblocks and writes to the DB again.
>
> The access_log table is currently InnoDB and I am wondering whether MyISAM
> would behave better.
>
>
>
> I would prefer to have a real-time access log. My scenario is that when a
> user disconnects from squid, an aggregated report of the sites the user
> browsed becomes available on a web portal the user has access to. Usually
> there will be up to 20 users connected concurrently, so I have to check
> whether this approach scales. If it is not stable, I might fall back to log
> parsing (perhaps logstash or a custom parser) that parses the log and
> generates an aggregated report once per hour or day.
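The periodic-parsing fallback could be sketched roughly like this,
assuming Squid's default native log format (time, elapsed, client,
code/status, bytes, method, URL, user, hierarchy, type); the function
name and grouping choices are only an illustration:

```python
from collections import Counter, defaultdict
from urllib.parse import urlsplit

def aggregate(log_lines):
    """Count visited sites per user from native-format access log lines.

    Falls back to the client IP when the request was unauthenticated
    (user field is "-").
    """
    report = defaultdict(Counter)
    for line in log_lines:
        fields = line.split()
        if len(fields) < 10:
            continue  # skip malformed or truncated lines
        client, url, user = fields[2], fields[6], fields[7]
        key = user if user != "-" else client
        # Reduce full URLs and CONNECT targets ("host:443") to a hostname
        site = urlsplit(url).hostname or url.split(":")[0]
        report[key][site] += 1
    return report
```

Run hourly from cron over the rotated access.log, this would give the
per-user site counts the portal report needs.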
>
> Is there a way to format the log and pipe only some interesting fields to
> the DB, in order to lessen the load on it?
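One option along these lines is a small custom log daemon that stores
only the fields you care about. A rough sketch, assuming a custom
logformat that emits six fields, and using sqlite3 just to keep the
example self-contained (swap in a MariaDB driver for the real setup;
the table and column names are made up). Squid's logfile-daemon
protocol, as I understand it, prefixes each log record with an "L"
opcode:

```python
import sqlite3
import sys

def store(db_path, lines):
    """Read logfile-daemon protocol lines and insert selected fields."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS access_log "
                "(time REAL, client TEXT, user TEXT, status INTEGER, "
                "bytes INTEGER, url TEXT)")
    for line in lines:
        if not line.startswith("L"):
            continue  # other opcodes (rotate, flush, ...) are ignored here
        fields = line[1:].split()
        if len(fields) != 6:
            continue  # skip records that don't match the expected format
        t, client, user, status, nbytes, url = fields
        con.execute("INSERT INTO access_log VALUES (?, ?, ?, ?, ?, ?)",
                    (float(t), client, user, int(status), int(nbytes), url))
    con.commit()
    return con

# Entry point when run via logfile_daemon: store("/path/to/log.db", sys.stdin)
```

Pointing the logfile_daemon directive at a script like this would keep
the DB schema down to exactly the columns the reports use.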
>
>
>
>
>
> On Sun, May 13, 2018 at 1:25 AM, Eliezer Croitoru <eliezer at ngtech.co.il>
> wrote:
>
> Hey Alex,
>
>
>
> How were you logging to the DB? What configuration lines have you
> used?
>
> Also what log format have you used?
>
> Is it important to have real-time data in the DB, or is periodic parsing
> also an option?
>
>
>
> Eliezer
>
>
>
> ----
>
> Eliezer Croitoru
> Linux System Administrator
> Mobile: +972-5-28704261
> Email: eliezer at ngtech.co.il
>
>
>
> *From:* squid-users <squid-users-bounces at lists.squid-cache.org> *On
> Behalf Of *Alex K
> *Sent:* Saturday, May 5, 2018 01:20
> *To:* squid-users at lists.squid-cache.org
> *Subject:* [squid-users] Collecting squid logs to DB
>
>
>
> Hi all,
>
> I had a previous setup on Debian 7 with squid, where I was using mysar to
> collect the squid logs, store them in a DB, and provide a browsing report
> at the end of the day.
>
> Now, on Debian 9, while upgrading the whole setup, I see that mysar does
> not compile.
>
> Looking around I found mysar-ng, but it also has compilation issues on
> Debian 9.
>
> Can you suggest any tool that does this job? Does Squid support logging to
> a DB natively? (I am using MySQL/MariaDB.)
>
> Some other tool I stumbled on is https://github.com/paranormal/blooper.
>
>
>
> Thanx a bunch,
>
> Alex
>
>
>

