[squid-users] decreased requests per second with big file size
ambadas.tdna at gmail.com
Wed Oct 14 06:00:48 UTC 2015
It's mostly like a live feed.
I am writing these sites (plus a client tracking parameter) to a flat file via
squid, from where another process reads it and does further processing (e.g.
analyzing the top sites used by any particular client).
That is why I was working on getting just the URLs entered by clients.
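The downstream reader described above could be sketched like this, assuming each line of the flat file holds a client identifier and a URL separated by whitespace (the file layout and function name are assumptions, not what the poster actually uses):

```python
from collections import Counter, defaultdict

def top_sites(lines, n=10):
    """Tally the most requested URLs per client from lines shaped like
    '<client_id> <url>' (an assumed flat-file layout)."""
    per_client = defaultdict(Counter)
    for line in lines:
        parts = line.split()
        if len(parts) < 2:
            continue  # skip malformed lines
        client, url = parts[0], parts[1]
        per_client[client][url] += 1
    # n most frequent URLs for each client
    return {c: counts.most_common(n) for c, counts in per_client.items()}
```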
On Tue, Oct 13, 2015 at 2:01 PM, Eliezer Croitoru <eliezer at ngtech.co.il>
> Hey Ambadas,
> I was wondering if you want it to be something like a "live feed" or just
> for log analysis?
> On 09/10/2015 15:47, Ambadas H wrote:
>> I am using below setup:
>> Squid proxy 3.5.4.
>> CentOS 7.1
>> I am trying to analyze the most used websites by the users via Squid.
>> I just require the first GET request for that particular browsed page
>> & not the subsequent GETs of that same page.
>> 1) user enters *http://google.com <http://google.com>* in the client browser
>> 2) client gets a page containing some other URLs
>> 3) client initiates multiple GETs for the same requested page without user interaction
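To experiment with heuristics over those requests, the Referer header can be written into Squid's access log with a custom logformat. A sketch (the format name and log path are assumptions; `%>a`, `%ru`, and `%{Referer}>h` are standard Squid format codes for the client address, request URL, and a request header):

```
# squid.conf sketch: log client IP, requested URL, and the Referer header
logformat withref %>a %ru "%{Referer}>h"
access_log /var/log/squid/urls.log withref
```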
>> I myself tried a logic where I assumed that if the "Referer" header is present,
>> it's not the first GET but a subsequent one for the same requested page.
>> I know I can't rely on the "Referer" header always being present, as it's not
>> mandatory. But I want to know if my logic is correct, and also if there's any
>> alternative approach.
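That heuristic could be sketched as a filter over parsed log entries, assuming each entry is a dict with `method` and `referer` fields (the entry layout and function name are illustrative assumptions):

```python
def first_gets(entries):
    """Keep only GET requests that carry no Referer header, treating them
    as user-initiated navigations. This is a heuristic: privacy settings
    can strip Referer from subresource requests, while typed URLs and
    bookmarks genuinely send none."""
    return [e for e in entries
            if e.get("method") == "GET" and not e.get("referer")]
```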
>> squid-users mailing list
>> squid-users at lists.squid-cache.org